Apr 24 21:24:56.526480 ip-10-0-129-230 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 21:24:56.526492 ip-10-0-129-230 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 21:24:56.526502 ip-10-0-129-230 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 21:24:56.526819 ip-10-0-129-230 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 21:25:07.632036 ip-10-0-129-230 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 21:25:07.632054 ip-10-0-129-230 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot a8be4e3255b449e2a197506763d3faa4 --
Apr 24 21:27:27.478385 ip-10-0-129-230 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:27:27.851521 ip-10-0-129-230 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:27.851521 ip-10-0-129-230 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:27:27.851521 ip-10-0-129-230 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:27.851521 ip-10-0-129-230 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:27:27.851521 ip-10-0-129-230 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:27.854511 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.854443 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 24 21:27:27.861066 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861053 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:27:27.861066 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861066 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:27:27.861136 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861070 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:27:27.861136 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861073 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:27:27.861136 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861076 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:27:27.861136 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861079 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:27:27.861136 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861082 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:27:27.861136 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861085 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:27.861136 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861087 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:27:27.861136 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861090 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:27:27.861136 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861092 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:27:27.861136 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861095 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:27:27.861136 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861098 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:27:27.861136 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861100 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:27:27.861136 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861103 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:27:27.861136 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861105 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:27.861136 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861108 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:27:27.861136 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861111 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:27.861136 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861113 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:27.861136 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861116 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:27:27.861136 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861118 2570 feature_gate.go:328] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:27.861136 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861121 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:27.861611 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861123 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:27:27.861611 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861126 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:27.861611 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861129 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:27:27.861611 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861132 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:27:27.861611 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861134 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:27.861611 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861137 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:27:27.861611 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861139 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:27:27.861611 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861142 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:27.861611 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861153 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:27.861611 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861156 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:27:27.861611 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861158 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:27.861611 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861160 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:27.861611 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861163 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:27:27.861611 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861166 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:27:27.861611 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861168 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:27:27.861611 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861171 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:27:27.861611 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861173 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:27:27.861611 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861176 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:27.861611 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861178 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:27:27.861611 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861181 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:27:27.862145 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861183 2570 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesvSphere Apr 24 21:27:27.862145 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861185 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:27.862145 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861187 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:27:27.862145 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861190 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:27.862145 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861192 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:27:27.862145 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861195 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:27:27.862145 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861199 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:27:27.862145 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861203 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:27:27.862145 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861208 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:27:27.862145 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861211 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:27:27.862145 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861213 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:27:27.862145 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861216 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:27:27.862145 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861218 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:27:27.862145 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861222 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:27:27.862145 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861225 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:27:27.862145 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861228 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:27:27.862145 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861230 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:27:27.862145 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861233 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:27.862145 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861235 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:27:27.862145 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861238 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:27:27.862618 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861242 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:27:27.862618 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861245 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:27.862618 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861248 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:27.862618 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861251 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:27.862618 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861253 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:27:27.862618 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861256 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:27:27.862618 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861259 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:27.862618 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861262 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:27:27.862618 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861264 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:27.862618 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861267 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:27:27.862618 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861269 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:27:27.862618 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861271 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:27.862618 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861274 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:27:27.862618 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861276 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:27.862618 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861278 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:27:27.862618 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861281 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:27:27.862618 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861283 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:27:27.862618 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861285 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:27:27.862618 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861288 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:27.862618 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861291 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:27:27.863106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861294 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:27:27.863106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861297 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:27:27.863106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.861299 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:27.863106 ip-10-0-129-230 kubenswrapper[2570]: W0424 
21:27:27.861302 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:27:27.863106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862880 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:27:27.863106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862888 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:27:27.863106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862892 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:27.863106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862894 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:27:27.863106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862898 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:27:27.863106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862901 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:27:27.863106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862904 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:27.863106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862906 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:27.863106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862909 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:27:27.863106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862912 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:27:27.863106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862915 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:27:27.863106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862917 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:27:27.863106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862920 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:27:27.863106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862923 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:27:27.863106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862925 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:27:27.863565 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862927 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:27.863565 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862930 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:27:27.863565 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862932 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:27.863565 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862935 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:27:27.863565 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862937 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:27:27.863565 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862939 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:27.863565 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862942 2570 
feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:27:27.863565 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862945 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:27:27.863565 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862949 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:27.863565 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862952 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:27.863565 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862954 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:27:27.863565 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862957 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:27:27.863565 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862959 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:27:27.863565 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862962 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:27:27.863565 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862966 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:27:27.863565 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862969 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:27.863565 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862971 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:27.863565 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862974 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:27:27.863565 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862976 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:27:27.863565 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862980 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:27:27.864106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862982 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:27:27.864106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862985 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:27.864106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862987 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:27:27.864106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862990 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:27:27.864106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862992 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:27:27.864106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862995 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:27:27.864106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862997 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:27.864106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.862999 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:27.864106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863002 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 
21:27:27.864106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863004 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:27:27.864106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863007 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:27:27.864106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863009 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:27:27.864106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863011 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:27:27.864106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863014 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:27:27.864106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863016 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:27.864106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863033 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:27:27.864106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863035 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:27:27.864106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863039 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:27:27.864106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863043 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:27:27.864106 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863047 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:27:27.864594 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863050 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:27.864594 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863053 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:27:27.864594 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863056 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:27:27.864594 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863058 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:27.864594 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863061 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:27.864594 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863064 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:27:27.864594 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863067 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:27:27.864594 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863070 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:27.864594 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863073 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:27.864594 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863076 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:27:27.864594 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863078 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:27:27.864594 ip-10-0-129-230 
kubenswrapper[2570]: W0424 21:27:27.863081 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:27:27.864594 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863083 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:27:27.864594 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863086 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:27:27.864594 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863088 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:27:27.864594 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863091 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:27.864594 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863094 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:27:27.864594 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863096 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:27:27.864594 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863099 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:27.864594 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863101 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:27:27.865091 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863103 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:27:27.865091 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863106 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:27.865091 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863108 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:27.865091 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863111 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:27:27.865091 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863114 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:27:27.865091 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863117 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:27.865091 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863119 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:27:27.865091 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863122 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:27:27.865091 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863124 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:27:27.865091 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863127 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:27.865091 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863129 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:27:27.865091 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863191 2570 flags.go:64] FLAG: --address="0.0.0.0" Apr 24 21:27:27.865091 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863197 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 24 21:27:27.865091 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863203 2570 flags.go:64] FLAG: --anonymous-auth="true" Apr 24 
21:27:27.865091 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863207 2570 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 24 21:27:27.865091 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863211 2570 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 24 21:27:27.865091 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863215 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 24 21:27:27.865091 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863220 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 24 21:27:27.865091 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863225 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 24 21:27:27.865091 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863229 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 24 21:27:27.865091 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863232 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863235 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863238 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863241 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863244 2570 flags.go:64] FLAG: --cgroup-root="" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863247 2570 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863250 2570 flags.go:64] FLAG: --client-ca-file="" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863253 2570 flags.go:64] FLAG: --cloud-config="" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863255 2570 flags.go:64] FLAG: --cloud-provider="external" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863258 2570 flags.go:64] FLAG: --cluster-dns="[]" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863261 2570 flags.go:64] FLAG: --cluster-domain="" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863264 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863267 2570 flags.go:64] FLAG: --config-dir="" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863270 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863274 2570 flags.go:64] FLAG: --container-log-max-files="5" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863277 2570 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863280 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863283 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863286 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 24 21:27:27.865610 
ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863289 2570 flags.go:64] FLAG: --contention-profiling="false" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863292 2570 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863295 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863298 2570 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863301 2570 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863305 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 24 21:27:27.865610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863308 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863310 2570 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863313 2570 flags.go:64] FLAG: --enable-load-reader="false" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863316 2570 flags.go:64] FLAG: --enable-server="true" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863319 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863323 2570 flags.go:64] FLAG: --event-burst="100" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863327 2570 flags.go:64] FLAG: --event-qps="50" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863330 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863333 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863336 2570 flags.go:64] FLAG: --eviction-hard="" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863339 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863342 2570 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863345 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863348 2570 flags.go:64] FLAG: --eviction-soft="" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863351 2570 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863354 2570 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863357 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863360 2570 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863362 2570 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863365 2570 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 
21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863368 2570 flags.go:64] FLAG: --feature-gates="" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863372 2570 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863375 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863378 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863381 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863385 2570 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:27:27.866241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863388 2570 flags.go:64] FLAG: --help="false" Apr 24 21:27:27.866863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863390 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-129-230.ec2.internal" Apr 24 21:27:27.866863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863393 2570 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:27:27.866863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863396 2570 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:27:27.866863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863399 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:27:27.866863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863403 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:27:27.866863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863406 2570 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:27:27.866863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863409 2570 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:27:27.866863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863411 2570 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:27:27.866863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863414 2570 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:27:27.866863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863417 2570 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:27:27.866863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863420 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:27:27.866863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863423 2570 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:27:27.866863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863425 2570 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:27:27.866863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863428 2570 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:27:27.866863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863431 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:27:27.866863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863434 2570 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:27:27.866863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863437 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:27:27.866863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863439 2570 flags.go:64] FLAG: --lock-file="" Apr 24 
21:27:27.866863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863442 2570 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:27:27.866863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863445 2570 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:27:27.866863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863448 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:27:27.866863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863453 2570 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:27:27.866863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863455 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863458 2570 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863461 2570 flags.go:64] FLAG: --logging-format="text" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863463 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863469 2570 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863472 2570 flags.go:64] FLAG: --manifest-url="" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863475 2570 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863478 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863481 2570 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863486 2570 flags.go:64] FLAG: --max-pods="110" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863489 2570 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863492 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863495 2570 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863498 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863500 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863503 2570 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863506 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863513 2570 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863517 2570 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863520 2570 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863522 2570 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863525 2570 flags.go:64] 
FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863531 2570 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863534 2570 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:27:27.867420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863537 2570 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863540 2570 flags.go:64] FLAG: --port="10250" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863543 2570 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863546 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-01d1423693c7eb67a" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863549 2570 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863552 2570 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863554 2570 flags.go:64] FLAG: --register-node="true" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863557 2570 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863560 2570 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863563 2570 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863566 2570 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863569 2570 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863572 2570 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863579 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863582 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863585 2570 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863588 2570 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863591 2570 flags.go:64] FLAG: --runonce="false" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863593 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863596 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863599 2570 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863602 2570 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863605 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:27:27.868148 ip-10-0-129-230 
kubenswrapper[2570]: I0424 21:27:27.863608 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863611 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863614 2570 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:27:27.868148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863616 2570 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863619 2570 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863622 2570 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863625 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863627 2570 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863630 2570 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863633 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863638 2570 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863641 2570 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863644 2570 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863647 2570 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863650 2570 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863653 2570 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863655 2570 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863658 2570 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863661 2570 flags.go:64] FLAG: --v="2" Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863665 2570 flags.go:64] FLAG: --version="false" Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863668 2570 flags.go:64] FLAG: --vmodule="" Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863672 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.863676 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863760 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863763 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863767 2570 feature_gate.go:328] unrecognized 
feature gate: ImageStreamImportMode Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863769 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:27:27.868761 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863772 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:27:27.869392 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863775 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:27.869392 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863777 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:27:27.869392 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863780 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:27:27.869392 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863783 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:27:27.869392 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863785 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:27:27.869392 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863788 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:27.869392 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863790 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:27:27.869392 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863792 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:27.869392 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863795 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:27.869392 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863797 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:27:27.869392 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863800 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:27.869392 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863803 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:27:27.869392 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863807 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:27:27.869392 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863809 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:27:27.869392 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863812 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:27:27.869392 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863815 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:27.869392 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863817 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:27:27.869392 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863820 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:27:27.869392 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863822 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:27:27.869392 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863824 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:27:27.869882 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863827 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:27.869882 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863829 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:27:27.869882 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863832 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:27:27.869882 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863834 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:27:27.869882 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863836 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:27.869882 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863839 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:27:27.869882 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863843 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:27.869882 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863845 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:27.869882 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863847 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:27:27.869882 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863850 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:27:27.869882 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863852 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:27:27.869882 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863855 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:27.869882 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863859 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:27:27.869882 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863861 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:27.869882 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863864 2570 
feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:27:27.869882 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863866 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:27:27.869882 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863868 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:27:27.869882 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863871 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:27:27.869882 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863873 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:27.869882 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863876 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:27:27.870556 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863878 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:27:27.870556 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863881 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:27.870556 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863883 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:27:27.870556 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863886 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:27:27.870556 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863888 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:27:27.870556 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863890 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:27.870556 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863893 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:27:27.870556 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863895 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:27:27.870556 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863898 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:27:27.870556 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863900 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:27:27.870556 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863903 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:27.870556 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863906 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:27:27.870556 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863908 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:27:27.870556 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863910 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:27:27.870556 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863913 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:27:27.870556 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863915 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:27:27.870556 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863917 2570 feature_gate.go:328] unrecognized feature gate: 
MachineAPIMigration Apr 24 21:27:27.870556 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863920 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:27:27.870556 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863923 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:27:27.870556 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863926 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:27:27.871304 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863928 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:27:27.871304 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863931 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:27.871304 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863934 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:27:27.871304 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863936 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:27.871304 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863940 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:27:27.871304 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863943 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:27.871304 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863945 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:27.871304 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863948 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:27.871304 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863950 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:27:27.871304 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863953 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:27.871304 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863955 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:27:27.871304 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863957 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:27:27.871304 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863960 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:27.871304 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863962 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:27:27.871304 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863964 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:27:27.871304 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863967 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:27:27.871304 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863969 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:27:27.871304 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863972 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:27:27.871304 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863974 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:27:27.871846 
ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863978 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:27:27.871846 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.863981 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:27.871846 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.864668 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:27:27.872193 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.872175 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 21:27:27.872226 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.872194 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 21:27:27.872258 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872244 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:27.872258 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872249 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:27:27.872258 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872253 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:27:27.872258 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872256 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:27:27.872258 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872259 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:27:27.872388 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872263 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:27:27.872388 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872266 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:27:27.872388 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872269 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:27.872388 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872271 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:27.872388 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872274 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:27:27.872388 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872278 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
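The feature_gate.go:384 line above is the useful part of this block: it is the effective map of gates the kubelet actually recognizes (KMSv1=true, ServiceAccountTokenNodeBinding=true, UserNamespacesSupport=true, and so on), and the two accompanying warnings note that ServiceAccountTokenNodeBinding is already GA, so setting it is redundant, and that KMSv1 is deprecated. If these gates were being set by hand instead of by the rendered machine config, the supported place for them would be the featureGates stanza of the file passed to the kubelet's --config flag; a minimal sketch, with values copied from the map above and an illustrative file path:

    # e.g. /etc/kubernetes/kubelet.conf, the file passed via --config (path illustrative)
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    featureGates:
      KMSv1: true                            # deprecated upstream, per the warning above
      ServiceAccountTokenNodeBinding: true   # already GA, so listing it only triggers the warning
      UserNamespacesSupport: true
      ProcMountType: true
      NodeSwap: false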
Apr 24 21:27:27.872388 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872283 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:27:27.872388 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872286 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:27:27.872388 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872288 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:27:27.872388 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872291 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:27:27.872388 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872294 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:27:27.872388 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872297 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:27:27.872388 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872299 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:27:27.872388 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872302 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:27:27.872388 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872304 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:27:27.872388 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872307 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:27:27.872388 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872310 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:27:27.872388 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872312 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:27:27.872388 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872315 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:27:27.872850 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872318 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:27:27.872850 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872321 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:27.872850 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872323 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:27.872850 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872326 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:27:27.872850 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872329 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:27.872850 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872331 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:27.872850 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872334 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:27.872850 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872337 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:27:27.872850 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872340 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:27:27.872850 ip-10-0-129-230 
kubenswrapper[2570]: W0424 21:27:27.872342 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:27:27.872850 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872345 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:27.872850 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872347 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:27.872850 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872350 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:27.872850 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872353 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:27:27.872850 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872356 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:27:27.872850 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872358 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:27:27.872850 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872361 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:27.872850 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872365 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:27:27.872850 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872369 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:27:27.872850 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872372 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:27:27.873437 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872374 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:27:27.873437 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872377 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:27:27.873437 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872381 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:27:27.873437 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872383 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:27:27.873437 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872386 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:27:27.873437 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872388 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:27.873437 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872391 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:27:27.873437 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872393 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:27:27.873437 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872396 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:27:27.873437 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872399 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:27:27.873437 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872402 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:27:27.873437 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872404 2570 
feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:27:27.873437 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872407 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:27.873437 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872410 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:27:27.873437 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872412 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:27:27.873437 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872415 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:27.873437 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872417 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:27:27.873437 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872420 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:27:27.873437 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872423 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:27:27.873437 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872425 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:27.873928 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872427 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:27:27.873928 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872430 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:27:27.873928 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872433 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:27:27.873928 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872435 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:27.873928 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872438 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:27.873928 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872440 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:27.873928 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872443 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:27.873928 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872446 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:27:27.873928 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872449 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:27:27.873928 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872451 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:27.873928 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872454 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:27.873928 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872456 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:27:27.873928 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872459 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:27.873928 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872462 2570 feature_gate.go:328] unrecognized feature gate: 
GCPCustomAPIEndpointsInstall Apr 24 21:27:27.873928 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872464 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:27:27.873928 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872467 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:27:27.873928 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872469 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:27:27.873928 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872472 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:27:27.873928 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872474 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:27.873928 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872477 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:27:27.874431 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872480 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:27.874431 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872482 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:27:27.874431 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.872487 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:27:27.874431 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872583 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:27:27.874431 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872589 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:27:27.874431 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872592 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:27:27.874431 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872595 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:27:27.874431 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872598 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:27.874431 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872601 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:27:27.874431 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872604 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:27:27.874431 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872608 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:27:27.874431 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872612 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:27.874431 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872615 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:27:27.874431 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872618 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:27:27.874431 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872621 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:27:27.874801 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872623 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:27:27.874801 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872626 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:27.874801 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872628 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:27:27.874801 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872631 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:27.874801 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872634 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:27.874801 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872637 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:27:27.874801 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872640 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:27:27.874801 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872642 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:27:27.874801 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872645 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:27:27.874801 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872647 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:27:27.874801 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872651 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:27:27.874801 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872653 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:27:27.874801 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872656 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:27.874801 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872658 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:27:27.874801 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872661 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:27.874801 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872664 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:27:27.874801 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872666 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:27:27.874801 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872668 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:27.874801 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872671 2570 
feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:27:27.874801 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872673 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:27:27.875308 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872675 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:27:27.875308 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872678 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:27.875308 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872680 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:27.875308 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872683 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:27.875308 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872685 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:27.875308 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872688 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:27:27.875308 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872691 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:27:27.875308 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872693 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:27:27.875308 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872696 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:27:27.875308 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872698 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:27.875308 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872700 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:27:27.875308 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872703 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:27.875308 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872705 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:27:27.875308 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872708 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:27:27.875308 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872710 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:27:27.875308 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872712 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:27:27.875308 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872716 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:27:27.875308 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872718 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:27.875308 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872720 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:27:27.875308 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872723 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:27:27.875793 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872726 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 
21:27:27.875793 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872728 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:27:27.875793 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872730 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:27:27.875793 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872734 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:27.875793 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872736 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:27:27.875793 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872739 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:27:27.875793 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872741 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:27:27.875793 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872743 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:27.875793 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872746 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:27:27.875793 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872748 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:27.875793 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872750 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:27:27.875793 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872753 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:27:27.875793 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872756 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:27:27.875793 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872758 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:27:27.875793 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872760 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:27:27.875793 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872763 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:27.875793 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872765 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:27:27.875793 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872768 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:27:27.875793 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872770 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:27.875793 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872773 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:27:27.876290 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872775 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:27:27.876290 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872778 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:27.876290 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872780 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:27:27.876290 ip-10-0-129-230 kubenswrapper[2570]: W0424 
21:27:27.872782 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:27:27.876290 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872785 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:27:27.876290 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872787 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:27:27.876290 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872790 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:27:27.876290 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872792 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:27.876290 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872795 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:27:27.876290 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872797 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:27.876290 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872799 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:27.876290 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872802 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:27:27.876290 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872804 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:27.876290 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:27.872807 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:27:27.876290 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.872812 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:27:27.876644 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.873530 2570 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 21:27:27.876644 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.876164 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 21:27:27.877100 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.877039 2570 server.go:1019] "Starting client certificate rotation" Apr 24 21:27:27.877165 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.877139 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 21:27:27.877199 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.877194 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 21:27:27.900184 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.900167 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 21:27:27.902562 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.902534 2570 dynamic_cafile_content.go:161] "Starting controller" 
name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 21:27:27.919113 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.919092 2570 log.go:25] "Validated CRI v1 runtime API" Apr 24 21:27:27.925322 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.925305 2570 log.go:25] "Validated CRI v1 image API" Apr 24 21:27:27.926168 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.926153 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:27:27.926493 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.926478 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 21:27:27.929920 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.929901 2570 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b815a05b-9302-41c9-afc9-d8a666c2fed8:/dev/nvme0n1p3 cf46a1ef-dcaf-4466-8133-be51618144c8:/dev/nvme0n1p4] Apr 24 21:27:27.929964 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.929921 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 24 21:27:27.935441 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.935332 2570 manager.go:217] Machine: {Timestamp:2026-04-24 21:27:27.933536488 +0000 UTC m=+0.357043885 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098389 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2edcf7628012c2d087cfda3219d6d8 SystemUUID:ec2edcf7-6280-12c2-d087-cfda3219d6d8 BootID:a8be4e32-55b4-49e2-a197-506763d3faa4 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:69:69:8b:7d:01 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:69:69:8b:7d:01 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:c6:ac:cf:f1:38:2a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} 
{Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 24 21:27:27.935441 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.935437 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 24 21:27:27.935562 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.935536 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 24 21:27:27.936383 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.936358 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 21:27:27.936514 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.936386 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-230.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 21:27:27.936558 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.936523 2570 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 21:27:27.936558 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.936530 2570 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 21:27:27.936558 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.936543 2570 manager.go:141] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:27:27.937340 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.937330 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:27:27.938813 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.938803 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:27:27.938911 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.938902 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 21:27:27.941199 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.941189 2570 kubelet.go:491] "Attempting to sync node with API server" Apr 24 21:27:27.941242 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.941214 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 21:27:27.941242 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.941229 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 21:27:27.941242 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.941242 2570 kubelet.go:397] "Adding apiserver pod source" Apr 24 21:27:27.941369 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.941253 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 21:27:27.942300 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.942288 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:27:27.942356 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.942306 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:27:27.945565 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.945550 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 21:27:27.947128 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.947114 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:27:27.948168 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.948151 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:27:27.948168 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.948171 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:27:27.948267 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.948177 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:27:27.948267 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.948182 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 21:27:27.948267 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.948190 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:27:27.948267 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.948196 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:27:27.948267 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.948203 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 21:27:27.948267 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.948211 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:27:27.948267 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.948221 2570 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:27:27.948267 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.948229 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:27:27.948267 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.948242 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 21:27:27.948267 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.948250 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:27:27.948986 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.948967 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:27:27.949074 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.948989 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:27:27.952072 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:27.952018 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-230.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:27:27.952072 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:27.952018 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:27:27.952166 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.952079 2570 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-230.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:27:27.952771 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.952758 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:27:27.952800 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.952793 2570 server.go:1295] "Started kubelet" Apr 24 21:27:27.952908 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.952876 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 21:27:27.952975 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.952890 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:27:27.952975 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.952962 2570 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:27:27.953567 ip-10-0-129-230 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 21:27:27.954425 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.954356 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:27:27.959107 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.959081 2570 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:27:27.961534 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.961518 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:27:27.962157 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.962141 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:27:27.963822 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:27.960393 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-230.ec2.internal.18a96827c2a1d85d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-230.ec2.internal,UID:ip-10-0-129-230.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-230.ec2.internal,},FirstTimestamp:2026-04-24 21:27:27.952771165 +0000 UTC m=+0.376278566,LastTimestamp:2026-04-24 21:27:27.952771165 +0000 UTC m=+0.376278566,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-230.ec2.internal,}" Apr 24 21:27:27.963925 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:27.963912 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-230.ec2.internal\" not found" Apr 24 21:27:27.964284 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.964263 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:27:27.964284 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.964269 2570 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:27:27.964602 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.964300 2570 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:27:27.964602 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.964403 2570 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:27:27.964602 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.964415 2570 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:27:27.964602 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:27.964557 2570 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 21:27:27.964835 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.964821 2570 factory.go:153] Registering CRI-O factory Apr 24 21:27:27.964835 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.964836 2570 factory.go:223] Registration of the crio container factory successfully Apr 24 21:27:27.964948 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.964881 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:27:27.964948 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.964890 2570 factory.go:55] Registering systemd factory Apr 24 21:27:27.964948 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.964896 2570 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:27:27.964948 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.964913 2570 factory.go:103] Registering Raw factory Apr 24 21:27:27.964948 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.964922 2570 manager.go:1196] Started watching for new ooms in manager Apr 24 21:27:27.965382 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.965360 2570 manager.go:319] Starting recovery of all containers Apr 24 21:27:27.974570 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.974555 2570 manager.go:324] Recovery completed Apr 24 21:27:27.977151 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:27.977116 2570 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-230.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 21:27:27.977151 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:27.977118 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 21:27:27.979011 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.978995 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:27.981328 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.981308 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-230.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:27.981401 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.981341 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-230.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:27.981401 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.981350 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-230.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:27.981824 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.981807 2570 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:27:27.981824 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.981824 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:27:27.981904 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.981839 
2570 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:27:27.983860 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.983848 2570 policy_none.go:49] "None policy: Start" Apr 24 21:27:27.983893 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.983864 2570 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:27:27.983893 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:27.983873 2570 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:27:27.990150 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:27.990093 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-230.ec2.internal.18a96827c45595a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-230.ec2.internal,UID:ip-10-0-129-230.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-129-230.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-129-230.ec2.internal,},FirstTimestamp:2026-04-24 21:27:27.981327776 +0000 UTC m=+0.404835173,LastTimestamp:2026-04-24 21:27:27.981327776 +0000 UTC m=+0.404835173,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-230.ec2.internal,}" Apr 24 21:27:27.998506 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:27.998450 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-230.ec2.internal.18a96827c455db13 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-230.ec2.internal,UID:ip-10-0-129-230.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-129-230.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-129-230.ec2.internal,},FirstTimestamp:2026-04-24 21:27:27.981345555 +0000 UTC m=+0.404852954,LastTimestamp:2026-04-24 21:27:27.981345555 +0000 UTC m=+0.404852954,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-230.ec2.internal,}" Apr 24 21:27:28.005081 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.005058 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wnmnv" Apr 24 21:27:28.007150 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:28.007011 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-230.ec2.internal.18a96827c455fed4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-230.ec2.internal,UID:ip-10-0-129-230.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-129-230.ec2.internal status is now: 
NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-129-230.ec2.internal,},FirstTimestamp:2026-04-24 21:27:27.981354708 +0000 UTC m=+0.404862106,LastTimestamp:2026-04-24 21:27:27.981354708 +0000 UTC m=+0.404862106,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-230.ec2.internal,}" Apr 24 21:27:28.012995 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.012979 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wnmnv" Apr 24 21:27:28.018561 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.016699 2570 manager.go:341] "Starting Device Plugin manager" Apr 24 21:27:28.018561 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:28.016736 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:27:28.018561 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.016748 2570 server.go:85] "Starting device plugin registration server" Apr 24 21:27:28.018561 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.016993 2570 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:27:28.018561 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.017005 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:27:28.018561 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.017539 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:27:28.018561 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.017608 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:27:28.018561 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.017618 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:27:28.032092 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:28.018580 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 21:27:28.032092 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:28.018621 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-230.ec2.internal\" not found" Apr 24 21:27:28.061040 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.060996 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:27:28.062288 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.062268 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 21:27:28.062367 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.062293 2570 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:27:28.062367 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.062307 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
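Up to this point the kubelet is still bootstrapping its API credentials: the rejected requests above (events, the node lease, the CSIDriver list) are all made as system:anonymous, which is what the TLS bootstrap flow looks like before the bootstrap client certificate is issued; once csr-wnmnv is approved and issued at 21:27:28 those denials stop. If a node stays stuck at this stage, the usual places to look are the CertificateSigningRequest objects and the kubelet journal. A minimal check, assuming cluster-admin access from a workstation (illustrative commands, not taken from this capture; the CSR name is the one seen in the log above):

    # List CSRs and their conditions; a healthy bootstrap shows Approved,Issued
    kubectl get csr
    # Inspect the kubelet client bootstrap CSR from the log
    kubectl describe csr csr-wnmnv
    # Manually approve a CSR left Pending (normally the approver does this automatically)
    kubectl certificate approve <csr-name>
    # Follow the kubelet journal on the node itself
    journalctl -u kubelet -f
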
Apr 24 21:27:28.062367 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.062314 2570 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:27:28.062483 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:28.062370 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:27:28.065489 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.065473 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:28.119415 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.119370 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:28.120073 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.120057 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-230.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:28.120156 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.120089 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-230.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:28.120156 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.120104 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-230.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:28.120156 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.120131 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-230.ec2.internal" Apr 24 21:27:28.126012 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.125998 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-230.ec2.internal" Apr 24 21:27:28.126092 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:28.126018 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-230.ec2.internal\": node \"ip-10-0-129-230.ec2.internal\" not found" Apr 24 21:27:28.144332 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:28.144309 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-230.ec2.internal\" not found" Apr 24 21:27:28.163067 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.163047 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-129-230.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-230.ec2.internal"] Apr 24 21:27:28.163137 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.163099 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:28.163719 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.163701 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-230.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:28.163798 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.163728 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-230.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:28.163798 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.163737 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-230.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:28.164739 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.164727 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:28.164817 ip-10-0-129-230 
kubenswrapper[2570]: I0424 21:27:28.164792 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-230.ec2.internal" Apr 24 21:27:28.164873 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.164819 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:28.165367 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.165351 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-230.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:28.165448 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.165356 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-230.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:28.165448 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.165392 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-230.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:28.165448 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.165401 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-230.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:28.165448 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.165376 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-230.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:28.165588 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.165463 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-230.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:28.166428 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.166413 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-230.ec2.internal" Apr 24 21:27:28.166485 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.166444 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:28.167085 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.167068 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-230.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:28.167152 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.167102 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-230.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:28.167152 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.167119 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-230.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:28.193644 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:28.193621 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-230.ec2.internal\" not found" node="ip-10-0-129-230.ec2.internal" Apr 24 21:27:28.197888 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:28.197868 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-230.ec2.internal\" not found" node="ip-10-0-129-230.ec2.internal" Apr 24 21:27:28.245159 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:28.245142 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-230.ec2.internal\" not found" Apr 24 21:27:28.265571 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.265551 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b9e08fa0509c3abd65bcc96428aa523-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-230.ec2.internal\" (UID: \"8b9e08fa0509c3abd65bcc96428aa523\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-230.ec2.internal" Apr 24 21:27:28.265648 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.265578 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7f83006f9b8c8c12eeba4e850f965db6-config\") pod \"kube-apiserver-proxy-ip-10-0-129-230.ec2.internal\" (UID: \"7f83006f9b8c8c12eeba4e850f965db6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-230.ec2.internal" Apr 24 21:27:28.265648 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.265593 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8b9e08fa0509c3abd65bcc96428aa523-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-230.ec2.internal\" (UID: \"8b9e08fa0509c3abd65bcc96428aa523\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-230.ec2.internal" Apr 24 21:27:28.346097 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:28.346070 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-230.ec2.internal\" not found" Apr 24 21:27:28.366437 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.366416 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/7f83006f9b8c8c12eeba4e850f965db6-config\") pod \"kube-apiserver-proxy-ip-10-0-129-230.ec2.internal\" (UID: \"7f83006f9b8c8c12eeba4e850f965db6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-230.ec2.internal" Apr 24 21:27:28.366499 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.366486 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7f83006f9b8c8c12eeba4e850f965db6-config\") pod \"kube-apiserver-proxy-ip-10-0-129-230.ec2.internal\" (UID: \"7f83006f9b8c8c12eeba4e850f965db6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-230.ec2.internal" Apr 24 21:27:28.366535 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.366501 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8b9e08fa0509c3abd65bcc96428aa523-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-230.ec2.internal\" (UID: \"8b9e08fa0509c3abd65bcc96428aa523\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-230.ec2.internal" Apr 24 21:27:28.366535 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.366521 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b9e08fa0509c3abd65bcc96428aa523-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-230.ec2.internal\" (UID: \"8b9e08fa0509c3abd65bcc96428aa523\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-230.ec2.internal" Apr 24 21:27:28.366591 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.366540 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8b9e08fa0509c3abd65bcc96428aa523-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-230.ec2.internal\" (UID: \"8b9e08fa0509c3abd65bcc96428aa523\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-230.ec2.internal" Apr 24 21:27:28.366591 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.366554 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b9e08fa0509c3abd65bcc96428aa523-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-230.ec2.internal\" (UID: \"8b9e08fa0509c3abd65bcc96428aa523\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-230.ec2.internal" Apr 24 21:27:28.446837 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:28.446815 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-230.ec2.internal\" not found" Apr 24 21:27:28.495345 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.495324 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-230.ec2.internal" Apr 24 21:27:28.499754 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.499738 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-230.ec2.internal" Apr 24 21:27:28.547389 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:28.547369 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-230.ec2.internal\" not found" Apr 24 21:27:28.647871 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:28.647853 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-230.ec2.internal\" not found" Apr 24 21:27:28.748302 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:28.748255 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-230.ec2.internal\" not found" Apr 24 21:27:28.848723 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:28.848695 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-230.ec2.internal\" not found" Apr 24 21:27:28.877152 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.877130 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 21:27:28.877642 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.877248 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:27:28.949663 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:28.949636 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-230.ec2.internal\" not found" Apr 24 21:27:28.962323 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.962209 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 21:27:28.966506 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:28.966479 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b9e08fa0509c3abd65bcc96428aa523.slice/crio-761e342c1ec53f67039437f4b6a3feeb93c5173792fea7331847d6585a13c78a WatchSource:0}: Error finding container 761e342c1ec53f67039437f4b6a3feeb93c5173792fea7331847d6585a13c78a: Status 404 returned error can't find the container with id 761e342c1ec53f67039437f4b6a3feeb93c5173792fea7331847d6585a13c78a Apr 24 21:27:28.967171 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:28.967155 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f83006f9b8c8c12eeba4e850f965db6.slice/crio-28b7b3b429f5ac8d3cfa01b83beb8ae776012693f3c595c29d9e13621918045f WatchSource:0}: Error finding container 28b7b3b429f5ac8d3cfa01b83beb8ae776012693f3c595c29d9e13621918045f: Status 404 returned error can't find the container with id 28b7b3b429f5ac8d3cfa01b83beb8ae776012693f3c595c29d9e13621918045f Apr 24 21:27:28.971815 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.971801 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:27:28.975487 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:28.975454 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:27:29.015176 ip-10-0-129-230 kubenswrapper[2570]: I0424 
21:27:29.015109 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:22:28 +0000 UTC" deadline="2027-11-26 01:37:36.14011162 +0000 UTC" Apr 24 21:27:29.015176 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.015146 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13924h10m7.124968624s" Apr 24 21:27:29.050425 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:29.050405 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-230.ec2.internal\" not found" Apr 24 21:27:29.054744 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.054731 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-zzrmp" Apr 24 21:27:29.063291 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.063276 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-zzrmp" Apr 24 21:27:29.065038 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.064991 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-230.ec2.internal" event={"ID":"7f83006f9b8c8c12eeba4e850f965db6","Type":"ContainerStarted","Data":"28b7b3b429f5ac8d3cfa01b83beb8ae776012693f3c595c29d9e13621918045f"} Apr 24 21:27:29.065868 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.065851 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-230.ec2.internal" event={"ID":"8b9e08fa0509c3abd65bcc96428aa523","Type":"ContainerStarted","Data":"761e342c1ec53f67039437f4b6a3feeb93c5173792fea7331847d6585a13c78a"} Apr 24 21:27:29.104302 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.104282 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:29.134775 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.134757 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:29.165014 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.164997 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-230.ec2.internal" Apr 24 21:27:29.177945 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.177929 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:27:29.178669 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.178656 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-230.ec2.internal" Apr 24 21:27:29.186090 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.186074 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:27:29.369993 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.369924 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:29.941923 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.941896 2570 apiserver.go:52] "Watching apiserver" Apr 24 21:27:29.948110 ip-10-0-129-230 
kubenswrapper[2570]: I0424 21:27:29.948083 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 21:27:29.949669 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.949634 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-74945","openshift-network-operator/iptables-alerter-vmvcc","kube-system/kube-apiserver-proxy-ip-10-0-129-230.ec2.internal","openshift-dns/node-resolver-6lvzd","openshift-image-registry/node-ca-dxhb2","openshift-multus/multus-gpzd8","openshift-multus/network-metrics-daemon-892qf","openshift-network-diagnostics/network-check-target-bjr9v","openshift-ovn-kubernetes/ovnkube-node-6c6jh","kube-system/konnectivity-agent-f5p8z","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh","openshift-cluster-node-tuning-operator/tuned-vrsl9","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-230.ec2.internal"] Apr 24 21:27:29.952295 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.952269 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:29.952383 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:29.952345 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892qf" podUID="87439f5a-542b-48ed-980f-a2183de13b6f" Apr 24 21:27:29.956696 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.956350 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vmvcc" Apr 24 21:27:29.956696 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.956444 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6lvzd" Apr 24 21:27:29.959220 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.958519 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-dxhb2" Apr 24 21:27:29.959220 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.958686 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 21:27:29.959220 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.959059 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-285zj\"" Apr 24 21:27:29.959220 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.959093 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:27:29.959220 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.959120 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 21:27:29.959220 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.959062 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 21:27:29.959551 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.959400 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 21:27:29.959637 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.959623 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2vjjw\"" Apr 24 21:27:29.960516 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.960499 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:27:29.961045 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.960739 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gpzd8" Apr 24 21:27:29.961045 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.960969 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-z47zn\"" Apr 24 21:27:29.961455 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.961436 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:27:29.961989 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.961972 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:27:29.962767 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.962749 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:29.965217 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.964843 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 21:27:29.965217 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.964848 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 21:27:29.965217 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.964996 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 21:27:29.965217 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.965067 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 21:27:29.965217 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.965068 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 21:27:29.965217 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.965070 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 21:27:29.965522 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.965297 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-pdptk\"" Apr 24 21:27:29.965522 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.965356 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-f6n7r\"" Apr 24 21:27:29.967108 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.967087 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:27:29.967202 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:29.967150 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjr9v" podUID="1a1d36cf-baab-4f24-a9d1-4dde21da6db3" Apr 24 21:27:29.967202 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.967193 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:29.969549 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.969510 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 21:27:29.969706 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.969691 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 21:27:29.969795 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.969694 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-q8d98\"" Apr 24 21:27:29.970293 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.970007 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 21:27:29.970293 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.970086 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 21:27:29.970293 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.970138 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 21:27:29.970537 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.970387 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 21:27:29.972611 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.972593 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-f5p8z" Apr 24 21:27:29.972712 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.972692 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" Apr 24 21:27:29.975095 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.974889 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 21:27:29.975095 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.974967 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 21:27:29.975095 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.974983 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-bwdgz\"" Apr 24 21:27:29.975095 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.975073 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-wv2wf\"" Apr 24 21:27:29.975363 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.975269 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:29.975423 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.975366 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 21:27:29.975593 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.975576 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 21:27:29.975824 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.975807 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 21:27:29.976077 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976049 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpxms\" (UniqueName: \"kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms\") pod \"network-check-target-bjr9v\" (UID: \"1a1d36cf-baab-4f24-a9d1-4dde21da6db3\") " pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:27:29.976141 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976097 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs\") pod \"network-metrics-daemon-892qf\" (UID: \"87439f5a-542b-48ed-980f-a2183de13b6f\") " pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:29.976141 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976135 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a30efbd8-e476-46d3-a06f-675f751559d5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:29.976233 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976159 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-host-run-k8s-cni-cncf-io\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:29.976233 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976188 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-host-var-lib-kubelet\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:29.976310 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976247 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tctg\" (UniqueName: \"kubernetes.io/projected/2331d294-90e6-4527-bfaa-8f3913c788e1-kube-api-access-6tctg\") pod \"node-ca-dxhb2\" (UID: \"2331d294-90e6-4527-bfaa-8f3913c788e1\") " pod="openshift-image-registry/node-ca-dxhb2" Apr 24 21:27:29.976310 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976280 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/a30efbd8-e476-46d3-a06f-675f751559d5-cnibin\") pod \"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:29.976386 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976308 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a30efbd8-e476-46d3-a06f-675f751559d5-os-release\") pod \"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:29.976386 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976334 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-cnibin\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:29.976386 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976371 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a30efbd8-e476-46d3-a06f-675f751559d5-system-cni-dir\") pod \"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:29.976498 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976396 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6db0669e-da93-4120-888c-ab35559e48f8-cni-binary-copy\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:29.976498 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976420 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f3e83c89-0e34-4b6d-aa0b-98737298d3d7-iptables-alerter-script\") pod \"iptables-alerter-vmvcc\" (UID: \"f3e83c89-0e34-4b6d-aa0b-98737298d3d7\") " pod="openshift-network-operator/iptables-alerter-vmvcc" Apr 24 21:27:29.976498 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976438 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfbsx\" (UniqueName: \"kubernetes.io/projected/f3e83c89-0e34-4b6d-aa0b-98737298d3d7-kube-api-access-rfbsx\") pod \"iptables-alerter-vmvcc\" (UID: \"f3e83c89-0e34-4b6d-aa0b-98737298d3d7\") " pod="openshift-network-operator/iptables-alerter-vmvcc" Apr 24 21:27:29.976498 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976455 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2331d294-90e6-4527-bfaa-8f3913c788e1-serviceca\") pod \"node-ca-dxhb2\" (UID: \"2331d294-90e6-4527-bfaa-8f3913c788e1\") " pod="openshift-image-registry/node-ca-dxhb2" Apr 24 21:27:29.976498 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976476 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e90ff02b-a473-4732-a712-a6377d84bf43-tmp-dir\") pod \"node-resolver-6lvzd\" (UID: 
\"e90ff02b-a473-4732-a712-a6377d84bf43\") " pod="openshift-dns/node-resolver-6lvzd" Apr 24 21:27:29.976498 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976497 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-hostroot\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:29.976781 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976524 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f3e83c89-0e34-4b6d-aa0b-98737298d3d7-host-slash\") pod \"iptables-alerter-vmvcc\" (UID: \"f3e83c89-0e34-4b6d-aa0b-98737298d3d7\") " pod="openshift-network-operator/iptables-alerter-vmvcc" Apr 24 21:27:29.976781 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976551 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e90ff02b-a473-4732-a712-a6377d84bf43-hosts-file\") pod \"node-resolver-6lvzd\" (UID: \"e90ff02b-a473-4732-a712-a6377d84bf43\") " pod="openshift-dns/node-resolver-6lvzd" Apr 24 21:27:29.976781 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976601 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-multus-cni-dir\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:29.976781 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976629 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-multus-socket-dir-parent\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:29.976781 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976658 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-host-run-netns\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:29.976781 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976705 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-multus-conf-dir\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:29.976781 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976742 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6db0669e-da93-4120-888c-ab35559e48f8-multus-daemon-config\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:29.976781 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976769 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/a30efbd8-e476-46d3-a06f-675f751559d5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:29.977136 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976792 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-host-var-lib-cni-bin\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:29.977136 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976816 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-host-run-multus-certs\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:29.977136 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976833 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gkps\" (UniqueName: \"kubernetes.io/projected/6db0669e-da93-4120-888c-ab35559e48f8-kube-api-access-6gkps\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:29.977136 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976849 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkrrq\" (UniqueName: \"kubernetes.io/projected/87439f5a-542b-48ed-980f-a2183de13b6f-kube-api-access-hkrrq\") pod \"network-metrics-daemon-892qf\" (UID: \"87439f5a-542b-48ed-980f-a2183de13b6f\") " pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:29.977136 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976865 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a30efbd8-e476-46d3-a06f-675f751559d5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:29.977136 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976892 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5wxs\" (UniqueName: \"kubernetes.io/projected/a30efbd8-e476-46d3-a06f-675f751559d5-kube-api-access-j5wxs\") pod \"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:29.977136 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976906 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsnxt\" (UniqueName: \"kubernetes.io/projected/e90ff02b-a473-4732-a712-a6377d84bf43-kube-api-access-nsnxt\") pod \"node-resolver-6lvzd\" (UID: \"e90ff02b-a473-4732-a712-a6377d84bf43\") " pod="openshift-dns/node-resolver-6lvzd" Apr 24 21:27:29.977136 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976919 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-etc-kubernetes\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:29.977136 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976933 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2331d294-90e6-4527-bfaa-8f3913c788e1-host\") pod \"node-ca-dxhb2\" (UID: \"2331d294-90e6-4527-bfaa-8f3913c788e1\") " pod="openshift-image-registry/node-ca-dxhb2" Apr 24 21:27:29.977136 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976947 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a30efbd8-e476-46d3-a06f-675f751559d5-cni-binary-copy\") pod \"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:29.977136 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.976989 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-system-cni-dir\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:29.977136 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.977057 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-os-release\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:29.977136 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.977086 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-host-var-lib-cni-multus\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:29.977652 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.977317 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:27:29.977713 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.977670 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-z62gp\"" Apr 24 21:27:29.977788 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:29.977764 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 21:27:30.064181 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.064148 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:29 +0000 UTC" deadline="2027-12-04 14:08:04.693775715 +0000 UTC" Apr 24 21:27:30.064181 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.064179 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14128h40m34.629599858s" Apr 24 21:27:30.065635 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.065600 2570 
desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 21:27:30.078047 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.078006 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6db0669e-da93-4120-888c-ab35559e48f8-multus-daemon-config\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.078160 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.078066 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkm5q\" (UniqueName: \"kubernetes.io/projected/7b219b45-d422-43ea-8238-d0f73e8f85e3-kube-api-access-nkm5q\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.078160 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.078094 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7203859a-3465-4088-aaf1-c39a752936b3-ovnkube-script-lib\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.078160 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.078122 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gkps\" (UniqueName: \"kubernetes.io/projected/6db0669e-da93-4120-888c-ab35559e48f8-kube-api-access-6gkps\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.078160 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.078152 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-var-lib-kubelet\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.078348 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.078181 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkrrq\" (UniqueName: \"kubernetes.io/projected/87439f5a-542b-48ed-980f-a2183de13b6f-kube-api-access-hkrrq\") pod \"network-metrics-daemon-892qf\" (UID: \"87439f5a-542b-48ed-980f-a2183de13b6f\") " pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:30.078348 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.078215 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a30efbd8-e476-46d3-a06f-675f751559d5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:30.078752 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.078264 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5wxs\" (UniqueName: \"kubernetes.io/projected/a30efbd8-e476-46d3-a06f-675f751559d5-kube-api-access-j5wxs\") pod \"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:30.078836 ip-10-0-129-230 
kubenswrapper[2570]: I0424 21:27:30.078760 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a30efbd8-e476-46d3-a06f-675f751559d5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:30.078836 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.078788 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsnxt\" (UniqueName: \"kubernetes.io/projected/e90ff02b-a473-4732-a712-a6377d84bf43-kube-api-access-nsnxt\") pod \"node-resolver-6lvzd\" (UID: \"e90ff02b-a473-4732-a712-a6377d84bf43\") " pod="openshift-dns/node-resolver-6lvzd" Apr 24 21:27:30.078836 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.078821 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-etc-kubernetes\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.079010 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.078849 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-run\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.079010 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.078882 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-host-kubelet\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.079010 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.078900 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-etc-kubernetes\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.079010 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.078914 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a30efbd8-e476-46d3-a06f-675f751559d5-cni-binary-copy\") pod \"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:30.079010 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.078952 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpxms\" (UniqueName: \"kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms\") pod \"network-check-target-bjr9v\" (UID: \"1a1d36cf-baab-4f24-a9d1-4dde21da6db3\") " pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:27:30.079010 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.078998 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6db0669e-da93-4120-888c-ab35559e48f8-multus-daemon-config\") pod 
\"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.079312 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079090 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-host-run-netns\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.079312 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079137 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-etc-openvswitch\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.079312 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079155 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1b7be731-9db3-42b5-831e-c40b988f32aa-agent-certs\") pod \"konnectivity-agent-f5p8z\" (UID: \"1b7be731-9db3-42b5-831e-c40b988f32aa\") " pod="kube-system/konnectivity-agent-f5p8z" Apr 24 21:27:30.079312 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079174 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs\") pod \"network-metrics-daemon-892qf\" (UID: \"87439f5a-542b-48ed-980f-a2183de13b6f\") " pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:30.079312 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079228 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-host-run-k8s-cni-cncf-io\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.079312 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079267 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-host-var-lib-kubelet\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.079312 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:30.079276 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:30.079312 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079298 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-sys\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.079312 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079304 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-host-run-k8s-cni-cncf-io\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " 
pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.079724 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079319 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tctg\" (UniqueName: \"kubernetes.io/projected/2331d294-90e6-4527-bfaa-8f3913c788e1-kube-api-access-6tctg\") pod \"node-ca-dxhb2\" (UID: \"2331d294-90e6-4527-bfaa-8f3913c788e1\") " pod="openshift-image-registry/node-ca-dxhb2" Apr 24 21:27:30.079724 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079337 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a30efbd8-e476-46d3-a06f-675f751559d5-cnibin\") pod \"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:30.079724 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:30.079370 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs podName:87439f5a-542b-48ed-980f-a2183de13b6f nodeName:}" failed. No retries permitted until 2026-04-24 21:27:30.579333068 +0000 UTC m=+3.002840470 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs") pod "network-metrics-daemon-892qf" (UID: "87439f5a-542b-48ed-980f-a2183de13b6f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:30.079724 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079381 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a30efbd8-e476-46d3-a06f-675f751559d5-cnibin\") pod \"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:30.079724 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079382 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-host-var-lib-kubelet\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.079724 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079491 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a30efbd8-e476-46d3-a06f-675f751559d5-os-release\") pod \"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:30.079724 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079501 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a30efbd8-e476-46d3-a06f-675f751559d5-cni-binary-copy\") pod \"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:30.079724 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079520 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-cnibin\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 
21:27:30.079724 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079563 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-etc-sysconfig\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.079724 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079588 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a30efbd8-e476-46d3-a06f-675f751559d5-os-release\") pod \"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:30.079724 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079603 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-run-systemd\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.079724 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079624 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-cnibin\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.079724 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079629 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-run-ovn\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.079724 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079670 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a30efbd8-e476-46d3-a06f-675f751559d5-system-cni-dir\") pod \"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:30.079724 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079696 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6db0669e-da93-4120-888c-ab35559e48f8-cni-binary-copy\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.079724 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079724 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1b7be731-9db3-42b5-831e-c40b988f32aa-konnectivity-ca\") pod \"konnectivity-agent-f5p8z\" (UID: \"1b7be731-9db3-42b5-831e-c40b988f32aa\") " pod="kube-system/konnectivity-agent-f5p8z" Apr 24 21:27:30.080304 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079770 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a30efbd8-e476-46d3-a06f-675f751559d5-system-cni-dir\") pod 
\"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:30.080304 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079765 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f6fa6e75-2769-408c-a21d-d05ee9ab8ea3-socket-dir\") pod \"aws-ebs-csi-driver-node-xrdbh\" (UID: \"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" Apr 24 21:27:30.080304 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079810 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f3e83c89-0e34-4b6d-aa0b-98737298d3d7-iptables-alerter-script\") pod \"iptables-alerter-vmvcc\" (UID: \"f3e83c89-0e34-4b6d-aa0b-98737298d3d7\") " pod="openshift-network-operator/iptables-alerter-vmvcc" Apr 24 21:27:30.080304 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.079837 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rfbsx\" (UniqueName: \"kubernetes.io/projected/f3e83c89-0e34-4b6d-aa0b-98737298d3d7-kube-api-access-rfbsx\") pod \"iptables-alerter-vmvcc\" (UID: \"f3e83c89-0e34-4b6d-aa0b-98737298d3d7\") " pod="openshift-network-operator/iptables-alerter-vmvcc" Apr 24 21:27:30.080304 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080080 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlxfp\" (UniqueName: \"kubernetes.io/projected/f6fa6e75-2769-408c-a21d-d05ee9ab8ea3-kube-api-access-rlxfp\") pod \"aws-ebs-csi-driver-node-xrdbh\" (UID: \"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" Apr 24 21:27:30.080304 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080112 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-multus-cni-dir\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.080304 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080139 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-host-run-netns\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.080304 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080164 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-multus-conf-dir\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.080304 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080210 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtpx6\" (UniqueName: \"kubernetes.io/projected/7203859a-3465-4088-aaf1-c39a752936b3-kube-api-access-vtpx6\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.080304 ip-10-0-129-230 
kubenswrapper[2570]: I0424 21:27:30.080238 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f6fa6e75-2769-408c-a21d-d05ee9ab8ea3-etc-selinux\") pod \"aws-ebs-csi-driver-node-xrdbh\" (UID: \"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" Apr 24 21:27:30.080304 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080251 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-host-run-netns\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.080304 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080250 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6db0669e-da93-4120-888c-ab35559e48f8-cni-binary-copy\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.080304 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080285 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a30efbd8-e476-46d3-a06f-675f751559d5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:30.080304 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080299 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-multus-conf-dir\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.080304 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080208 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-multus-cni-dir\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.080854 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080312 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-host-var-lib-cni-bin\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.080854 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080337 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f3e83c89-0e34-4b6d-aa0b-98737298d3d7-iptables-alerter-script\") pod \"iptables-alerter-vmvcc\" (UID: \"f3e83c89-0e34-4b6d-aa0b-98737298d3d7\") " pod="openshift-network-operator/iptables-alerter-vmvcc" Apr 24 21:27:30.080854 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080349 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-host-run-multus-certs\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" 
Apr 24 21:27:30.080854 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080378 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-etc-modprobe-d\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.080854 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080402 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-host-run-multus-certs\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.080854 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080406 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-etc-systemd\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.080854 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080406 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a30efbd8-e476-46d3-a06f-675f751559d5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:30.080854 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080431 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-host-slash\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.080854 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080352 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-host-var-lib-cni-bin\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.080854 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080454 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7203859a-3465-4088-aaf1-c39a752936b3-env-overrides\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.080854 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080475 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-etc-kubernetes\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.080854 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080496 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/f6fa6e75-2769-408c-a21d-d05ee9ab8ea3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xrdbh\" (UID: \"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" Apr 24 21:27:30.080854 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080514 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f6fa6e75-2769-408c-a21d-d05ee9ab8ea3-registration-dir\") pod \"aws-ebs-csi-driver-node-xrdbh\" (UID: \"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" Apr 24 21:27:30.080854 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080542 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2331d294-90e6-4527-bfaa-8f3913c788e1-host\") pod \"node-ca-dxhb2\" (UID: \"2331d294-90e6-4527-bfaa-8f3913c788e1\") " pod="openshift-image-registry/node-ca-dxhb2" Apr 24 21:27:30.080854 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080558 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-system-cni-dir\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.080854 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080572 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-os-release\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.080854 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080658 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-os-release\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.081532 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080651 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2331d294-90e6-4527-bfaa-8f3913c788e1-host\") pod \"node-ca-dxhb2\" (UID: \"2331d294-90e6-4527-bfaa-8f3913c788e1\") " pod="openshift-image-registry/node-ca-dxhb2" Apr 24 21:27:30.081532 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080662 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-system-cni-dir\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.081532 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080686 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-host-var-lib-cni-multus\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.081532 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080722 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-etc-sysctl-conf\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.081532 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080719 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-host-var-lib-cni-multus\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.081532 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080749 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-run-openvswitch\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.081532 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080780 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-log-socket\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.081532 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080794 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-host-run-ovn-kubernetes\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.081532 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080823 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-host\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.081532 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080847 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a30efbd8-e476-46d3-a06f-675f751559d5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:30.081532 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080863 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-lib-modules\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.081532 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080880 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-host-cni-bin\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.081532 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080906 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-host-cni-netd\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.081532 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080925 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7203859a-3465-4088-aaf1-c39a752936b3-ovn-node-metrics-cert\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.081532 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080947 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f6fa6e75-2769-408c-a21d-d05ee9ab8ea3-device-dir\") pod \"aws-ebs-csi-driver-node-xrdbh\" (UID: \"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" Apr 24 21:27:30.081532 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.080998 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f6fa6e75-2769-408c-a21d-d05ee9ab8ea3-sys-fs\") pod \"aws-ebs-csi-driver-node-xrdbh\" (UID: \"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" Apr 24 21:27:30.081532 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.081054 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7b219b45-d422-43ea-8238-d0f73e8f85e3-etc-tuned\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.082128 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.081072 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7b219b45-d422-43ea-8238-d0f73e8f85e3-tmp\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.082128 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.081087 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-systemd-units\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.082128 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.081101 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.082128 ip-10-0-129-230 kubenswrapper[2570]: I0424 
21:27:30.081132 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2331d294-90e6-4527-bfaa-8f3913c788e1-serviceca\") pod \"node-ca-dxhb2\" (UID: \"2331d294-90e6-4527-bfaa-8f3913c788e1\") " pod="openshift-image-registry/node-ca-dxhb2" Apr 24 21:27:30.082128 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.081160 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e90ff02b-a473-4732-a712-a6377d84bf43-tmp-dir\") pod \"node-resolver-6lvzd\" (UID: \"e90ff02b-a473-4732-a712-a6377d84bf43\") " pod="openshift-dns/node-resolver-6lvzd" Apr 24 21:27:30.082128 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.081185 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-hostroot\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.082128 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.081369 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a30efbd8-e476-46d3-a06f-675f751559d5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:30.082128 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.081417 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-etc-sysctl-d\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.082128 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.081445 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-var-lib-openvswitch\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.082128 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.081453 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e90ff02b-a473-4732-a712-a6377d84bf43-tmp-dir\") pod \"node-resolver-6lvzd\" (UID: \"e90ff02b-a473-4732-a712-a6377d84bf43\") " pod="openshift-dns/node-resolver-6lvzd" Apr 24 21:27:30.082128 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.081488 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-node-log\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.082128 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.081504 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7203859a-3465-4088-aaf1-c39a752936b3-ovnkube-config\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" 
Apr 24 21:27:30.082128 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.081524 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f3e83c89-0e34-4b6d-aa0b-98737298d3d7-host-slash\") pod \"iptables-alerter-vmvcc\" (UID: \"f3e83c89-0e34-4b6d-aa0b-98737298d3d7\") " pod="openshift-network-operator/iptables-alerter-vmvcc" Apr 24 21:27:30.082128 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.081531 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-hostroot\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.082128 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.081547 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e90ff02b-a473-4732-a712-a6377d84bf43-hosts-file\") pod \"node-resolver-6lvzd\" (UID: \"e90ff02b-a473-4732-a712-a6377d84bf43\") " pod="openshift-dns/node-resolver-6lvzd" Apr 24 21:27:30.082128 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.081572 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-multus-socket-dir-parent\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.082128 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.081592 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f3e83c89-0e34-4b6d-aa0b-98737298d3d7-host-slash\") pod \"iptables-alerter-vmvcc\" (UID: \"f3e83c89-0e34-4b6d-aa0b-98737298d3d7\") " pod="openshift-network-operator/iptables-alerter-vmvcc" Apr 24 21:27:30.082676 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.081600 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2331d294-90e6-4527-bfaa-8f3913c788e1-serviceca\") pod \"node-ca-dxhb2\" (UID: \"2331d294-90e6-4527-bfaa-8f3913c788e1\") " pod="openshift-image-registry/node-ca-dxhb2" Apr 24 21:27:30.082676 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.081637 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e90ff02b-a473-4732-a712-a6377d84bf43-hosts-file\") pod \"node-resolver-6lvzd\" (UID: \"e90ff02b-a473-4732-a712-a6377d84bf43\") " pod="openshift-dns/node-resolver-6lvzd" Apr 24 21:27:30.082676 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.081664 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6db0669e-da93-4120-888c-ab35559e48f8-multus-socket-dir-parent\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.084741 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.084714 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:27:30.085266 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:30.085248 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:30.085423 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:30.085272 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:30.085423 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:30.085284 2570 projected.go:194] Error preparing data for projected volume kube-api-access-wpxms for pod openshift-network-diagnostics/network-check-target-bjr9v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:30.085423 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:30.085325 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms podName:1a1d36cf-baab-4f24-a9d1-4dde21da6db3 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:30.585312021 +0000 UTC m=+3.008819405 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-wpxms" (UniqueName: "kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms") pod "network-check-target-bjr9v" (UID: "1a1d36cf-baab-4f24-a9d1-4dde21da6db3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:30.088235 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.088212 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gkps\" (UniqueName: \"kubernetes.io/projected/6db0669e-da93-4120-888c-ab35559e48f8-kube-api-access-6gkps\") pod \"multus-gpzd8\" (UID: \"6db0669e-da93-4120-888c-ab35559e48f8\") " pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.088546 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.088374 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsnxt\" (UniqueName: \"kubernetes.io/projected/e90ff02b-a473-4732-a712-a6377d84bf43-kube-api-access-nsnxt\") pod \"node-resolver-6lvzd\" (UID: \"e90ff02b-a473-4732-a712-a6377d84bf43\") " pod="openshift-dns/node-resolver-6lvzd" Apr 24 21:27:30.088546 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.088522 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkrrq\" (UniqueName: \"kubernetes.io/projected/87439f5a-542b-48ed-980f-a2183de13b6f-kube-api-access-hkrrq\") pod \"network-metrics-daemon-892qf\" (UID: \"87439f5a-542b-48ed-980f-a2183de13b6f\") " pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:30.088737 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.088714 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5wxs\" (UniqueName: \"kubernetes.io/projected/a30efbd8-e476-46d3-a06f-675f751559d5-kube-api-access-j5wxs\") pod \"multus-additional-cni-plugins-74945\" (UID: \"a30efbd8-e476-46d3-a06f-675f751559d5\") " pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:30.088911 ip-10-0-129-230 kubenswrapper[2570]: I0424 
21:27:30.088894 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tctg\" (UniqueName: \"kubernetes.io/projected/2331d294-90e6-4527-bfaa-8f3913c788e1-kube-api-access-6tctg\") pod \"node-ca-dxhb2\" (UID: \"2331d294-90e6-4527-bfaa-8f3913c788e1\") " pod="openshift-image-registry/node-ca-dxhb2" Apr 24 21:27:30.091899 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.091850 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfbsx\" (UniqueName: \"kubernetes.io/projected/f3e83c89-0e34-4b6d-aa0b-98737298d3d7-kube-api-access-rfbsx\") pod \"iptables-alerter-vmvcc\" (UID: \"f3e83c89-0e34-4b6d-aa0b-98737298d3d7\") " pod="openshift-network-operator/iptables-alerter-vmvcc" Apr 24 21:27:30.182212 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182178 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-run\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.182212 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182224 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-host-kubelet\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.182456 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182292 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-host-kubelet\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.182456 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182305 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-run\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.182456 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182319 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-host-run-netns\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.182456 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182352 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-host-run-netns\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.182456 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182360 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-etc-openvswitch\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.182456 ip-10-0-129-230 
kubenswrapper[2570]: I0424 21:27:30.182384 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1b7be731-9db3-42b5-831e-c40b988f32aa-agent-certs\") pod \"konnectivity-agent-f5p8z\" (UID: \"1b7be731-9db3-42b5-831e-c40b988f32aa\") " pod="kube-system/konnectivity-agent-f5p8z" Apr 24 21:27:30.182456 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182441 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-etc-openvswitch\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.182775 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182487 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-sys\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.182775 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182512 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-etc-sysconfig\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.182775 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182538 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-run-systemd\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.182775 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182561 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-run-ovn\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.182775 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182568 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-sys\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.182775 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182586 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1b7be731-9db3-42b5-831e-c40b988f32aa-konnectivity-ca\") pod \"konnectivity-agent-f5p8z\" (UID: \"1b7be731-9db3-42b5-831e-c40b988f32aa\") " pod="kube-system/konnectivity-agent-f5p8z" Apr 24 21:27:30.182775 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182606 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f6fa6e75-2769-408c-a21d-d05ee9ab8ea3-socket-dir\") pod \"aws-ebs-csi-driver-node-xrdbh\" (UID: \"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" Apr 24 21:27:30.182775 
ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182620 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-run-systemd\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.182775 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182633 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rlxfp\" (UniqueName: \"kubernetes.io/projected/f6fa6e75-2769-408c-a21d-d05ee9ab8ea3-kube-api-access-rlxfp\") pod \"aws-ebs-csi-driver-node-xrdbh\" (UID: \"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" Apr 24 21:27:30.182775 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182662 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtpx6\" (UniqueName: \"kubernetes.io/projected/7203859a-3465-4088-aaf1-c39a752936b3-kube-api-access-vtpx6\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.182775 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182666 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-etc-sysconfig\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.182775 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182696 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f6fa6e75-2769-408c-a21d-d05ee9ab8ea3-etc-selinux\") pod \"aws-ebs-csi-driver-node-xrdbh\" (UID: \"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" Apr 24 21:27:30.182775 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182728 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-etc-modprobe-d\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.182775 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182750 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-etc-systemd\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.183411 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182775 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-host-slash\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.183411 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182814 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7203859a-3465-4088-aaf1-c39a752936b3-env-overrides\") pod \"ovnkube-node-6c6jh\" (UID: 
\"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.183411 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182839 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-etc-kubernetes\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.183411 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182854 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-run-ovn\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.183411 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182863 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6fa6e75-2769-408c-a21d-d05ee9ab8ea3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xrdbh\" (UID: \"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" Apr 24 21:27:30.183411 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182902 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6fa6e75-2769-408c-a21d-d05ee9ab8ea3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xrdbh\" (UID: \"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" Apr 24 21:27:30.183411 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182914 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f6fa6e75-2769-408c-a21d-d05ee9ab8ea3-registration-dir\") pod \"aws-ebs-csi-driver-node-xrdbh\" (UID: \"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" Apr 24 21:27:30.183411 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182946 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-etc-sysctl-conf\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.183411 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182950 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f6fa6e75-2769-408c-a21d-d05ee9ab8ea3-etc-selinux\") pod \"aws-ebs-csi-driver-node-xrdbh\" (UID: \"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" Apr 24 21:27:30.183411 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182966 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-run-openvswitch\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.183411 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182982 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-log-socket\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.183411 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182996 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-host-run-ovn-kubernetes\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.183411 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.182996 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f6fa6e75-2769-408c-a21d-d05ee9ab8ea3-socket-dir\") pod \"aws-ebs-csi-driver-node-xrdbh\" (UID: \"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" Apr 24 21:27:30.183411 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183013 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-host\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.183411 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183053 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-lib-modules\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.183411 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183070 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-host-cni-bin\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.183411 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183075 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-etc-modprobe-d\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.184180 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183085 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-host-cni-netd\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.184180 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183119 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-host-cni-netd\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.184180 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183125 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-etc-systemd\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.184180 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183170 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f6fa6e75-2769-408c-a21d-d05ee9ab8ea3-registration-dir\") pod \"aws-ebs-csi-driver-node-xrdbh\" (UID: \"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" Apr 24 21:27:30.184180 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183166 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1b7be731-9db3-42b5-831e-c40b988f32aa-konnectivity-ca\") pod \"konnectivity-agent-f5p8z\" (UID: \"1b7be731-9db3-42b5-831e-c40b988f32aa\") " pod="kube-system/konnectivity-agent-f5p8z" Apr 24 21:27:30.184180 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183246 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-etc-sysctl-conf\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.184180 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183237 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7203859a-3465-4088-aaf1-c39a752936b3-ovn-node-metrics-cert\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.184180 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183269 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-host-run-ovn-kubernetes\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.184180 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183286 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-host\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.184180 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183317 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-run-openvswitch\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.184180 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183315 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-log-socket\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.184180 ip-10-0-129-230 kubenswrapper[2570]: I0424 
21:27:30.183215 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-etc-kubernetes\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.184180 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183422 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-host-cni-bin\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.184180 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183430 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-host-slash\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.184180 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183458 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f6fa6e75-2769-408c-a21d-d05ee9ab8ea3-device-dir\") pod \"aws-ebs-csi-driver-node-xrdbh\" (UID: \"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" Apr 24 21:27:30.184180 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183459 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-lib-modules\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.184180 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183483 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f6fa6e75-2769-408c-a21d-d05ee9ab8ea3-sys-fs\") pod \"aws-ebs-csi-driver-node-xrdbh\" (UID: \"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" Apr 24 21:27:30.184180 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183498 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7b219b45-d422-43ea-8238-d0f73e8f85e3-etc-tuned\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.185206 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183512 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7b219b45-d422-43ea-8238-d0f73e8f85e3-tmp\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.185206 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183516 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f6fa6e75-2769-408c-a21d-d05ee9ab8ea3-device-dir\") pod \"aws-ebs-csi-driver-node-xrdbh\" (UID: \"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" Apr 24 21:27:30.185206 
ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183531 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-systemd-units\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.185206 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183556 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.185206 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183569 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f6fa6e75-2769-408c-a21d-d05ee9ab8ea3-sys-fs\") pod \"aws-ebs-csi-driver-node-xrdbh\" (UID: \"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" Apr 24 21:27:30.185206 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183578 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7203859a-3465-4088-aaf1-c39a752936b3-env-overrides\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.185206 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183584 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-etc-sysctl-d\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.185206 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183608 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-systemd-units\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.185206 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183620 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-var-lib-openvswitch\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.185206 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183649 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.185206 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183647 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-node-log\") pod 
\"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.185206 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183686 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-etc-sysctl-d\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.185206 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183690 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-var-lib-openvswitch\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.185206 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183696 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7203859a-3465-4088-aaf1-c39a752936b3-node-log\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.185206 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183724 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7203859a-3465-4088-aaf1-c39a752936b3-ovnkube-config\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.185206 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183758 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkm5q\" (UniqueName: \"kubernetes.io/projected/7b219b45-d422-43ea-8238-d0f73e8f85e3-kube-api-access-nkm5q\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.185206 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183786 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7203859a-3465-4088-aaf1-c39a752936b3-ovnkube-script-lib\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.186014 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183808 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-var-lib-kubelet\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.186014 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.183878 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b219b45-d422-43ea-8238-d0f73e8f85e3-var-lib-kubelet\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.186014 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.184188 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/7203859a-3465-4088-aaf1-c39a752936b3-ovnkube-config\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.186014 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.184427 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7203859a-3465-4088-aaf1-c39a752936b3-ovnkube-script-lib\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.186014 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.185309 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1b7be731-9db3-42b5-831e-c40b988f32aa-agent-certs\") pod \"konnectivity-agent-f5p8z\" (UID: \"1b7be731-9db3-42b5-831e-c40b988f32aa\") " pod="kube-system/konnectivity-agent-f5p8z" Apr 24 21:27:30.186261 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.186098 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7b219b45-d422-43ea-8238-d0f73e8f85e3-etc-tuned\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.186488 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.186469 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7203859a-3465-4088-aaf1-c39a752936b3-ovn-node-metrics-cert\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.186552 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.186472 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7b219b45-d422-43ea-8238-d0f73e8f85e3-tmp\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.192372 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.192272 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkm5q\" (UniqueName: \"kubernetes.io/projected/7b219b45-d422-43ea-8238-d0f73e8f85e3-kube-api-access-nkm5q\") pod \"tuned-vrsl9\" (UID: \"7b219b45-d422-43ea-8238-d0f73e8f85e3\") " pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.192372 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.192282 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtpx6\" (UniqueName: \"kubernetes.io/projected/7203859a-3465-4088-aaf1-c39a752936b3-kube-api-access-vtpx6\") pod \"ovnkube-node-6c6jh\" (UID: \"7203859a-3465-4088-aaf1-c39a752936b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.192372 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.192322 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlxfp\" (UniqueName: \"kubernetes.io/projected/f6fa6e75-2769-408c-a21d-d05ee9ab8ea3-kube-api-access-rlxfp\") pod \"aws-ebs-csi-driver-node-xrdbh\" (UID: \"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" Apr 24 21:27:30.269073 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.269048 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-vmvcc" Apr 24 21:27:30.276684 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.276667 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6lvzd" Apr 24 21:27:30.286380 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.286363 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-dxhb2" Apr 24 21:27:30.290943 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.290926 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gpzd8" Apr 24 21:27:30.296363 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.296338 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-74945" Apr 24 21:27:30.303911 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.303895 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:30.309441 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.309421 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-f5p8z" Apr 24 21:27:30.314912 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.314894 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" Apr 24 21:27:30.319467 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.319450 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" Apr 24 21:27:30.406594 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.406569 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:30.586612 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.586588 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpxms\" (UniqueName: \"kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms\") pod \"network-check-target-bjr9v\" (UID: \"1a1d36cf-baab-4f24-a9d1-4dde21da6db3\") " pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:27:30.586715 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:30.586622 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs\") pod \"network-metrics-daemon-892qf\" (UID: \"87439f5a-542b-48ed-980f-a2183de13b6f\") " pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:30.586785 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:30.586725 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:30.586785 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:30.586738 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:30.586785 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:30.586753 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:30.586785 
ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:30.586764 2570 projected.go:194] Error preparing data for projected volume kube-api-access-wpxms for pod openshift-network-diagnostics/network-check-target-bjr9v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:30.586785 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:30.586787 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs podName:87439f5a-542b-48ed-980f-a2183de13b6f nodeName:}" failed. No retries permitted until 2026-04-24 21:27:31.586767502 +0000 UTC m=+4.010274903 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs") pod "network-metrics-daemon-892qf" (UID: "87439f5a-542b-48ed-980f-a2183de13b6f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:30.587069 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:30.586805 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms podName:1a1d36cf-baab-4f24-a9d1-4dde21da6db3 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:31.586793931 +0000 UTC m=+4.010301321 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-wpxms" (UniqueName: "kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms") pod "network-check-target-bjr9v" (UID: "1a1d36cf-baab-4f24-a9d1-4dde21da6db3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:30.597654 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:30.597633 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3e83c89_0e34_4b6d_aa0b_98737298d3d7.slice/crio-4019956e45f13842dc440915f95bb503ce08645de1fdaca849414f2124af2112 WatchSource:0}: Error finding container 4019956e45f13842dc440915f95bb503ce08645de1fdaca849414f2124af2112: Status 404 returned error can't find the container with id 4019956e45f13842dc440915f95bb503ce08645de1fdaca849414f2124af2112 Apr 24 21:27:30.598811 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:30.598789 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6db0669e_da93_4120_888c_ab35559e48f8.slice/crio-5c58507fb53a6e99a99da7f3fe07b3442f9cc2ebe6de8422702aea9b1e69d3b9 WatchSource:0}: Error finding container 5c58507fb53a6e99a99da7f3fe07b3442f9cc2ebe6de8422702aea9b1e69d3b9: Status 404 returned error can't find the container with id 5c58507fb53a6e99a99da7f3fe07b3442f9cc2ebe6de8422702aea9b1e69d3b9 Apr 24 21:27:30.599813 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:30.599790 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda30efbd8_e476_46d3_a06f_675f751559d5.slice/crio-0a914ba5c19b95f345982300e8c17352533d3713406403a87e95e26e0125be35 WatchSource:0}: Error finding container 0a914ba5c19b95f345982300e8c17352533d3713406403a87e95e26e0125be35: Status 404 returned error can't find the container with id 0a914ba5c19b95f345982300e8c17352533d3713406403a87e95e26e0125be35 Apr 24 
21:27:30.600509 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:30.600478 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2331d294_90e6_4527_bfaa_8f3913c788e1.slice/crio-dc3114e9802c77b4f65c5be9f9892a1330e10cf88e15c90c13de10709e0eae27 WatchSource:0}: Error finding container dc3114e9802c77b4f65c5be9f9892a1330e10cf88e15c90c13de10709e0eae27: Status 404 returned error can't find the container with id dc3114e9802c77b4f65c5be9f9892a1330e10cf88e15c90c13de10709e0eae27 Apr 24 21:27:30.601266 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:30.601175 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7203859a_3465_4088_aaf1_c39a752936b3.slice/crio-8f04418d3a1a86c3ce0cd321805507f1a939adc79dbfd540778941d24b75d088 WatchSource:0}: Error finding container 8f04418d3a1a86c3ce0cd321805507f1a939adc79dbfd540778941d24b75d088: Status 404 returned error can't find the container with id 8f04418d3a1a86c3ce0cd321805507f1a939adc79dbfd540778941d24b75d088 Apr 24 21:27:30.607632 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:30.607608 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b219b45_d422_43ea_8238_d0f73e8f85e3.slice/crio-04fffd7bf45420379740cd2482024b2f96c9566a398ddbcb0e12ab287a7df493 WatchSource:0}: Error finding container 04fffd7bf45420379740cd2482024b2f96c9566a398ddbcb0e12ab287a7df493: Status 404 returned error can't find the container with id 04fffd7bf45420379740cd2482024b2f96c9566a398ddbcb0e12ab287a7df493 Apr 24 21:27:30.609224 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:27:30.608996 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6fa6e75_2769_408c_a21d_d05ee9ab8ea3.slice/crio-1934ecb8aa2b625948742164d5555b254967a94c6c156993aa98ce3c3a981a46 WatchSource:0}: Error finding container 1934ecb8aa2b625948742164d5555b254967a94c6c156993aa98ce3c3a981a46: Status 404 returned error can't find the container with id 1934ecb8aa2b625948742164d5555b254967a94c6c156993aa98ce3c3a981a46 Apr 24 21:27:31.064554 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:31.064512 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:29 +0000 UTC" deadline="2027-11-18 22:40:50.535599884 +0000 UTC" Apr 24 21:27:31.064554 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:31.064551 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13753h13m19.471052043s" Apr 24 21:27:31.073732 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:31.073699 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-230.ec2.internal" event={"ID":"7f83006f9b8c8c12eeba4e850f965db6","Type":"ContainerStarted","Data":"c0c763c139b40e68a9c1bccf833cc72f1d2dcc803cc9f4cf084f9b3cc56bdfa2"} Apr 24 21:27:31.079757 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:31.079720 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" event={"ID":"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3","Type":"ContainerStarted","Data":"1934ecb8aa2b625948742164d5555b254967a94c6c156993aa98ce3c3a981a46"} Apr 24 21:27:31.082095 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:31.082046 2570 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6lvzd" event={"ID":"e90ff02b-a473-4732-a712-a6377d84bf43","Type":"ContainerStarted","Data":"c9ec2e63d03f7dec080bc6ceac21da409d41e4792d25087d70b6391c0b60c0da"} Apr 24 21:27:31.083788 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:31.083694 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-74945" event={"ID":"a30efbd8-e476-46d3-a06f-675f751559d5","Type":"ContainerStarted","Data":"0a914ba5c19b95f345982300e8c17352533d3713406403a87e95e26e0125be35"} Apr 24 21:27:31.091120 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:31.091062 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gpzd8" event={"ID":"6db0669e-da93-4120-888c-ab35559e48f8","Type":"ContainerStarted","Data":"5c58507fb53a6e99a99da7f3fe07b3442f9cc2ebe6de8422702aea9b1e69d3b9"} Apr 24 21:27:31.094039 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:31.093322 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-230.ec2.internal" podStartSLOduration=2.093307675 podStartE2EDuration="2.093307675s" podCreationTimestamp="2026-04-24 21:27:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:31.093172336 +0000 UTC m=+3.516679744" watchObservedRunningTime="2026-04-24 21:27:31.093307675 +0000 UTC m=+3.516815084" Apr 24 21:27:31.097590 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:31.097391 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" event={"ID":"7b219b45-d422-43ea-8238-d0f73e8f85e3","Type":"ContainerStarted","Data":"04fffd7bf45420379740cd2482024b2f96c9566a398ddbcb0e12ab287a7df493"} Apr 24 21:27:31.101563 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:31.101483 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-f5p8z" event={"ID":"1b7be731-9db3-42b5-831e-c40b988f32aa","Type":"ContainerStarted","Data":"af9ebed642c1956019eb769d8e4516dbb59cf9eb1f8d252b4463f8de49ef741c"} Apr 24 21:27:31.104844 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:31.104647 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" event={"ID":"7203859a-3465-4088-aaf1-c39a752936b3","Type":"ContainerStarted","Data":"8f04418d3a1a86c3ce0cd321805507f1a939adc79dbfd540778941d24b75d088"} Apr 24 21:27:31.108830 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:31.108802 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dxhb2" event={"ID":"2331d294-90e6-4527-bfaa-8f3913c788e1","Type":"ContainerStarted","Data":"dc3114e9802c77b4f65c5be9f9892a1330e10cf88e15c90c13de10709e0eae27"} Apr 24 21:27:31.115505 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:31.115478 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vmvcc" event={"ID":"f3e83c89-0e34-4b6d-aa0b-98737298d3d7","Type":"ContainerStarted","Data":"4019956e45f13842dc440915f95bb503ce08645de1fdaca849414f2124af2112"} Apr 24 21:27:31.595256 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:31.595217 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpxms\" (UniqueName: \"kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms\") pod \"network-check-target-bjr9v\" (UID: 
\"1a1d36cf-baab-4f24-a9d1-4dde21da6db3\") " pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:27:31.595429 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:31.595275 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs\") pod \"network-metrics-daemon-892qf\" (UID: \"87439f5a-542b-48ed-980f-a2183de13b6f\") " pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:31.595429 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:31.595397 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:31.595548 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:31.595459 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs podName:87439f5a-542b-48ed-980f-a2183de13b6f nodeName:}" failed. No retries permitted until 2026-04-24 21:27:33.595439267 +0000 UTC m=+6.018946656 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs") pod "network-metrics-daemon-892qf" (UID: "87439f5a-542b-48ed-980f-a2183de13b6f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:31.595904 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:31.595883 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:31.595971 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:31.595912 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:31.595971 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:31.595925 2570 projected.go:194] Error preparing data for projected volume kube-api-access-wpxms for pod openshift-network-diagnostics/network-check-target-bjr9v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:31.596107 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:31.595969 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms podName:1a1d36cf-baab-4f24-a9d1-4dde21da6db3 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:33.595954349 +0000 UTC m=+6.019461741 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-wpxms" (UniqueName: "kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms") pod "network-check-target-bjr9v" (UID: "1a1d36cf-baab-4f24-a9d1-4dde21da6db3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:32.063946 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:32.063437 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:27:32.063946 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:32.063597 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjr9v" podUID="1a1d36cf-baab-4f24-a9d1-4dde21da6db3" Apr 24 21:27:32.064444 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:32.064302 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:32.064444 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:32.064411 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892qf" podUID="87439f5a-542b-48ed-980f-a2183de13b6f" Apr 24 21:27:32.128876 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:32.128839 2570 generic.go:358] "Generic (PLEG): container finished" podID="8b9e08fa0509c3abd65bcc96428aa523" containerID="05412cd1d3d615010b6507b876c74ec3ef1a979610b432e81333dd7d8c3a3ae9" exitCode=0 Apr 24 21:27:32.129853 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:32.129804 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-230.ec2.internal" event={"ID":"8b9e08fa0509c3abd65bcc96428aa523","Type":"ContainerDied","Data":"05412cd1d3d615010b6507b876c74ec3ef1a979610b432e81333dd7d8c3a3ae9"} Apr 24 21:27:33.134633 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:33.134593 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-230.ec2.internal" event={"ID":"8b9e08fa0509c3abd65bcc96428aa523","Type":"ContainerStarted","Data":"602c84bda90b181f6fb9b4fceafc611bfc95194c2ddf20cb5280fc43b010827a"} Apr 24 21:27:33.611199 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:33.611156 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpxms\" (UniqueName: \"kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms\") pod \"network-check-target-bjr9v\" (UID: \"1a1d36cf-baab-4f24-a9d1-4dde21da6db3\") " pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:27:33.611389 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:33.611211 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs\") pod \"network-metrics-daemon-892qf\" (UID: \"87439f5a-542b-48ed-980f-a2183de13b6f\") " pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:33.611389 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:33.611328 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:33.611389 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:33.611385 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs podName:87439f5a-542b-48ed-980f-a2183de13b6f nodeName:}" failed. No retries permitted until 2026-04-24 21:27:37.611368558 +0000 UTC m=+10.034875958 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs") pod "network-metrics-daemon-892qf" (UID: "87439f5a-542b-48ed-980f-a2183de13b6f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:33.611824 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:33.611803 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:33.611824 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:33.611823 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:33.612005 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:33.611836 2570 projected.go:194] Error preparing data for projected volume kube-api-access-wpxms for pod openshift-network-diagnostics/network-check-target-bjr9v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:33.612005 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:33.611879 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms podName:1a1d36cf-baab-4f24-a9d1-4dde21da6db3 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:37.611865606 +0000 UTC m=+10.035372992 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-wpxms" (UniqueName: "kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms") pod "network-check-target-bjr9v" (UID: "1a1d36cf-baab-4f24-a9d1-4dde21da6db3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:34.063215 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:34.063144 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:27:34.063215 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:34.063160 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:34.063392 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:34.063279 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjr9v" podUID="1a1d36cf-baab-4f24-a9d1-4dde21da6db3" Apr 24 21:27:34.063573 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:34.063524 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892qf" podUID="87439f5a-542b-48ed-980f-a2183de13b6f" Apr 24 21:27:36.064234 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:36.064196 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:27:36.064701 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:36.064305 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjr9v" podUID="1a1d36cf-baab-4f24-a9d1-4dde21da6db3" Apr 24 21:27:36.064701 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:36.064644 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:36.064820 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:36.064748 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892qf" podUID="87439f5a-542b-48ed-980f-a2183de13b6f" Apr 24 21:27:37.641793 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:37.641755 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpxms\" (UniqueName: \"kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms\") pod \"network-check-target-bjr9v\" (UID: \"1a1d36cf-baab-4f24-a9d1-4dde21da6db3\") " pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:27:37.641793 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:37.641796 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs\") pod \"network-metrics-daemon-892qf\" (UID: \"87439f5a-542b-48ed-980f-a2183de13b6f\") " pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:37.642424 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:37.641899 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:37.642424 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:37.641929 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:37.642424 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:37.641952 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:37.642424 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:37.641968 2570 projected.go:194] Error preparing data for projected volume kube-api-access-wpxms for pod openshift-network-diagnostics/network-check-target-bjr9v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:37.642424 ip-10-0-129-230 
kubenswrapper[2570]: E0424 21:27:37.641954 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs podName:87439f5a-542b-48ed-980f-a2183de13b6f nodeName:}" failed. No retries permitted until 2026-04-24 21:27:45.641937371 +0000 UTC m=+18.065444765 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs") pod "network-metrics-daemon-892qf" (UID: "87439f5a-542b-48ed-980f-a2183de13b6f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:37.642424 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:37.642050 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms podName:1a1d36cf-baab-4f24-a9d1-4dde21da6db3 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:45.642033904 +0000 UTC m=+18.065541290 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-wpxms" (UniqueName: "kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms") pod "network-check-target-bjr9v" (UID: "1a1d36cf-baab-4f24-a9d1-4dde21da6db3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:38.066697 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:38.066245 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:27:38.066697 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:38.066352 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjr9v" podUID="1a1d36cf-baab-4f24-a9d1-4dde21da6db3" Apr 24 21:27:38.066697 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:38.066390 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:38.066697 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:38.066474 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-892qf" podUID="87439f5a-542b-48ed-980f-a2183de13b6f" Apr 24 21:27:38.809906 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:38.809794 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-230.ec2.internal" podStartSLOduration=9.809745657 podStartE2EDuration="9.809745657s" podCreationTimestamp="2026-04-24 21:27:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:33.1514561 +0000 UTC m=+5.574963508" watchObservedRunningTime="2026-04-24 21:27:38.809745657 +0000 UTC m=+11.233253088" Apr 24 21:27:38.810419 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:38.810116 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-2sfj6"] Apr 24 21:27:38.813422 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:38.813088 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:38.813422 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:38.813176 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2sfj6" podUID="bb780578-f495-4afb-a817-59c177c8993a" Apr 24 21:27:38.851741 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:38.851711 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bb780578-f495-4afb-a817-59c177c8993a-original-pull-secret\") pod \"global-pull-secret-syncer-2sfj6\" (UID: \"bb780578-f495-4afb-a817-59c177c8993a\") " pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:38.851857 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:38.851812 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bb780578-f495-4afb-a817-59c177c8993a-kubelet-config\") pod \"global-pull-secret-syncer-2sfj6\" (UID: \"bb780578-f495-4afb-a817-59c177c8993a\") " pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:38.851857 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:38.851843 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bb780578-f495-4afb-a817-59c177c8993a-dbus\") pod \"global-pull-secret-syncer-2sfj6\" (UID: \"bb780578-f495-4afb-a817-59c177c8993a\") " pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:38.952635 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:38.952607 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bb780578-f495-4afb-a817-59c177c8993a-kubelet-config\") pod \"global-pull-secret-syncer-2sfj6\" (UID: \"bb780578-f495-4afb-a817-59c177c8993a\") " pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:38.952635 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:38.952640 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/bb780578-f495-4afb-a817-59c177c8993a-dbus\") pod \"global-pull-secret-syncer-2sfj6\" (UID: \"bb780578-f495-4afb-a817-59c177c8993a\") " pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:38.952824 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:38.952660 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bb780578-f495-4afb-a817-59c177c8993a-original-pull-secret\") pod \"global-pull-secret-syncer-2sfj6\" (UID: \"bb780578-f495-4afb-a817-59c177c8993a\") " pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:38.952824 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:38.952738 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bb780578-f495-4afb-a817-59c177c8993a-kubelet-config\") pod \"global-pull-secret-syncer-2sfj6\" (UID: \"bb780578-f495-4afb-a817-59c177c8993a\") " pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:38.952824 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:38.952762 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bb780578-f495-4afb-a817-59c177c8993a-dbus\") pod \"global-pull-secret-syncer-2sfj6\" (UID: \"bb780578-f495-4afb-a817-59c177c8993a\") " pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:38.952824 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:38.952773 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:38.952824 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:38.952822 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb780578-f495-4afb-a817-59c177c8993a-original-pull-secret podName:bb780578-f495-4afb-a817-59c177c8993a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:39.452808275 +0000 UTC m=+11.876315659 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bb780578-f495-4afb-a817-59c177c8993a-original-pull-secret") pod "global-pull-secret-syncer-2sfj6" (UID: "bb780578-f495-4afb-a817-59c177c8993a") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:39.455253 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:39.455218 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bb780578-f495-4afb-a817-59c177c8993a-original-pull-secret\") pod \"global-pull-secret-syncer-2sfj6\" (UID: \"bb780578-f495-4afb-a817-59c177c8993a\") " pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:39.455396 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:39.455340 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:39.455396 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:39.455390 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb780578-f495-4afb-a817-59c177c8993a-original-pull-secret podName:bb780578-f495-4afb-a817-59c177c8993a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:40.455372011 +0000 UTC m=+12.878879407 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bb780578-f495-4afb-a817-59c177c8993a-original-pull-secret") pod "global-pull-secret-syncer-2sfj6" (UID: "bb780578-f495-4afb-a817-59c177c8993a") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:40.063212 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:40.063132 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:27:40.063653 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:40.063132 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:40.063653 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:40.063128 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:40.063653 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:40.063384 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892qf" podUID="87439f5a-542b-48ed-980f-a2183de13b6f" Apr 24 21:27:40.063653 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:40.063252 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjr9v" podUID="1a1d36cf-baab-4f24-a9d1-4dde21da6db3" Apr 24 21:27:40.063653 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:40.063475 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2sfj6" podUID="bb780578-f495-4afb-a817-59c177c8993a" Apr 24 21:27:40.463571 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:40.463511 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bb780578-f495-4afb-a817-59c177c8993a-original-pull-secret\") pod \"global-pull-secret-syncer-2sfj6\" (UID: \"bb780578-f495-4afb-a817-59c177c8993a\") " pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:40.463753 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:40.463668 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:40.463753 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:40.463736 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb780578-f495-4afb-a817-59c177c8993a-original-pull-secret podName:bb780578-f495-4afb-a817-59c177c8993a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:42.463720281 +0000 UTC m=+14.887227666 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bb780578-f495-4afb-a817-59c177c8993a-original-pull-secret") pod "global-pull-secret-syncer-2sfj6" (UID: "bb780578-f495-4afb-a817-59c177c8993a") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:42.066256 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:42.066218 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:42.066256 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:42.066232 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:42.066765 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:42.066226 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:27:42.066765 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:42.066313 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2sfj6" podUID="bb780578-f495-4afb-a817-59c177c8993a" Apr 24 21:27:42.066765 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:42.066405 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjr9v" podUID="1a1d36cf-baab-4f24-a9d1-4dde21da6db3" Apr 24 21:27:42.066765 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:42.066503 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892qf" podUID="87439f5a-542b-48ed-980f-a2183de13b6f" Apr 24 21:27:42.481679 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:42.481643 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bb780578-f495-4afb-a817-59c177c8993a-original-pull-secret\") pod \"global-pull-secret-syncer-2sfj6\" (UID: \"bb780578-f495-4afb-a817-59c177c8993a\") " pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:42.481834 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:42.481786 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:42.481889 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:42.481842 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb780578-f495-4afb-a817-59c177c8993a-original-pull-secret podName:bb780578-f495-4afb-a817-59c177c8993a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:46.481828188 +0000 UTC m=+18.905335574 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bb780578-f495-4afb-a817-59c177c8993a-original-pull-secret") pod "global-pull-secret-syncer-2sfj6" (UID: "bb780578-f495-4afb-a817-59c177c8993a") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:44.062861 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:44.062826 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:27:44.063263 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:44.062826 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:44.063263 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:44.062945 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjr9v" podUID="1a1d36cf-baab-4f24-a9d1-4dde21da6db3" Apr 24 21:27:44.063263 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:44.063073 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892qf" podUID="87439f5a-542b-48ed-980f-a2183de13b6f" Apr 24 21:27:44.063263 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:44.062826 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:44.063263 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:44.063145 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-2sfj6" podUID="bb780578-f495-4afb-a817-59c177c8993a" Apr 24 21:27:45.708936 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:45.708899 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpxms\" (UniqueName: \"kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms\") pod \"network-check-target-bjr9v\" (UID: \"1a1d36cf-baab-4f24-a9d1-4dde21da6db3\") " pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:27:45.709324 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:45.708944 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs\") pod \"network-metrics-daemon-892qf\" (UID: \"87439f5a-542b-48ed-980f-a2183de13b6f\") " pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:45.709324 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:45.709089 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:45.709324 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:45.709090 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:45.709324 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:45.709126 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:45.709324 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:45.709141 2570 projected.go:194] Error preparing data for projected volume kube-api-access-wpxms for pod openshift-network-diagnostics/network-check-target-bjr9v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:45.709324 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:45.709161 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs podName:87439f5a-542b-48ed-980f-a2183de13b6f nodeName:}" failed. No retries permitted until 2026-04-24 21:28:01.709142083 +0000 UTC m=+34.132649468 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs") pod "network-metrics-daemon-892qf" (UID: "87439f5a-542b-48ed-980f-a2183de13b6f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:45.709324 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:45.709182 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms podName:1a1d36cf-baab-4f24-a9d1-4dde21da6db3 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:01.709169455 +0000 UTC m=+34.132676854 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wpxms" (UniqueName: "kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms") pod "network-check-target-bjr9v" (UID: "1a1d36cf-baab-4f24-a9d1-4dde21da6db3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:46.065249 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:46.065182 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:27:46.065383 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:46.065181 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:46.065383 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:46.065274 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjr9v" podUID="1a1d36cf-baab-4f24-a9d1-4dde21da6db3" Apr 24 21:27:46.065383 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:46.065181 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:46.065383 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:46.065351 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2sfj6" podUID="bb780578-f495-4afb-a817-59c177c8993a" Apr 24 21:27:46.065507 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:46.065446 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892qf" podUID="87439f5a-542b-48ed-980f-a2183de13b6f" Apr 24 21:27:46.514463 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:46.514428 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bb780578-f495-4afb-a817-59c177c8993a-original-pull-secret\") pod \"global-pull-secret-syncer-2sfj6\" (UID: \"bb780578-f495-4afb-a817-59c177c8993a\") " pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:46.514636 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:46.514576 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:46.514679 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:46.514645 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb780578-f495-4afb-a817-59c177c8993a-original-pull-secret podName:bb780578-f495-4afb-a817-59c177c8993a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:54.514629064 +0000 UTC m=+26.938136455 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bb780578-f495-4afb-a817-59c177c8993a-original-pull-secret") pod "global-pull-secret-syncer-2sfj6" (UID: "bb780578-f495-4afb-a817-59c177c8993a") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:48.064011 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:48.063715 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:48.064650 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:48.063747 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:27:48.064650 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:48.064119 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2sfj6" podUID="bb780578-f495-4afb-a817-59c177c8993a" Apr 24 21:27:48.064650 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:48.063770 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:48.064650 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:48.064201 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjr9v" podUID="1a1d36cf-baab-4f24-a9d1-4dde21da6db3" Apr 24 21:27:48.064650 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:48.064311 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-892qf" podUID="87439f5a-542b-48ed-980f-a2183de13b6f" Apr 24 21:27:48.164117 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:48.162961 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-f5p8z" event={"ID":"1b7be731-9db3-42b5-831e-c40b988f32aa","Type":"ContainerStarted","Data":"a609e13426594c27fb06f6faef79caeef7c9938e0ffafc2c431dde11da88353c"} Apr 24 21:27:48.165942 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:48.165909 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" event={"ID":"7203859a-3465-4088-aaf1-c39a752936b3","Type":"ContainerStarted","Data":"e5ecf339d69cd593a02d61060e40a51ddda99f371a07f07d6feb400a8a900548"} Apr 24 21:27:48.166064 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:48.165953 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" event={"ID":"7203859a-3465-4088-aaf1-c39a752936b3","Type":"ContainerStarted","Data":"20d6b91efe5de0ef10658b6601adabe67a3b9109dc99d4744bc8c05af4f5b43a"} Apr 24 21:27:48.167388 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:48.167358 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dxhb2" event={"ID":"2331d294-90e6-4527-bfaa-8f3913c788e1","Type":"ContainerStarted","Data":"e4ab062a9584333773e9424b2df7b4d6b45e8858f9a66da2aa30be46e91d7ad5"} Apr 24 21:27:48.168816 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:48.168791 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" event={"ID":"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3","Type":"ContainerStarted","Data":"307c85b9077e60122449b9fca96a0d72275ba077d7d471b665f7e85d751ca0fc"} Apr 24 21:27:48.170102 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:48.170079 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6lvzd" event={"ID":"e90ff02b-a473-4732-a712-a6377d84bf43","Type":"ContainerStarted","Data":"3d8544717b5157751660be4f6f6a6b70509b5b5daa256b94f126d25855b28381"} Apr 24 21:27:48.171550 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:48.171526 2570 generic.go:358] "Generic (PLEG): container finished" podID="a30efbd8-e476-46d3-a06f-675f751559d5" containerID="210280179e9713eb203a7dff0f20240849e5b6c9fb380fa9023f3449f1483d7f" exitCode=0 Apr 24 21:27:48.171640 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:48.171597 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-74945" event={"ID":"a30efbd8-e476-46d3-a06f-675f751559d5","Type":"ContainerDied","Data":"210280179e9713eb203a7dff0f20240849e5b6c9fb380fa9023f3449f1483d7f"} Apr 24 21:27:48.172900 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:48.172871 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gpzd8" event={"ID":"6db0669e-da93-4120-888c-ab35559e48f8","Type":"ContainerStarted","Data":"f4f113d2caa8219c549b3e697b939cedfa39b8237311c3ec1cd47c86b1cf4636"} Apr 24 21:27:48.174841 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:48.174820 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" event={"ID":"7b219b45-d422-43ea-8238-d0f73e8f85e3","Type":"ContainerStarted","Data":"374c9a9c6bf42d6abec3f3715d015abad2c4463082137a1045ac49fdbfd195bd"} Apr 24 21:27:48.198585 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:48.198547 2570 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="kube-system/konnectivity-agent-f5p8z" podStartSLOduration=11.021364301 podStartE2EDuration="20.198533615s" podCreationTimestamp="2026-04-24 21:27:28 +0000 UTC" firstStartedPulling="2026-04-24 21:27:30.609575397 +0000 UTC m=+3.033082787" lastFinishedPulling="2026-04-24 21:27:39.7867447 +0000 UTC m=+12.210252101" observedRunningTime="2026-04-24 21:27:48.18111258 +0000 UTC m=+20.604619987" watchObservedRunningTime="2026-04-24 21:27:48.198533615 +0000 UTC m=+20.622041022" Apr 24 21:27:48.199077 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:48.199041 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6lvzd" podStartSLOduration=3.220916854 podStartE2EDuration="20.199007614s" podCreationTimestamp="2026-04-24 21:27:28 +0000 UTC" firstStartedPulling="2026-04-24 21:27:30.604980871 +0000 UTC m=+3.028488256" lastFinishedPulling="2026-04-24 21:27:47.583071632 +0000 UTC m=+20.006579016" observedRunningTime="2026-04-24 21:27:48.198820018 +0000 UTC m=+20.622327406" watchObservedRunningTime="2026-04-24 21:27:48.199007614 +0000 UTC m=+20.622515022" Apr 24 21:27:48.210982 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:48.210950 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-dxhb2" podStartSLOduration=3.202977578 podStartE2EDuration="20.210939546s" podCreationTimestamp="2026-04-24 21:27:28 +0000 UTC" firstStartedPulling="2026-04-24 21:27:30.602412986 +0000 UTC m=+3.025920376" lastFinishedPulling="2026-04-24 21:27:47.610374937 +0000 UTC m=+20.033882344" observedRunningTime="2026-04-24 21:27:48.210759666 +0000 UTC m=+20.634267074" watchObservedRunningTime="2026-04-24 21:27:48.210939546 +0000 UTC m=+20.634446952" Apr 24 21:27:48.225822 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:48.225750 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-gpzd8" podStartSLOduration=3.180206207 podStartE2EDuration="20.22573542s" podCreationTimestamp="2026-04-24 21:27:28 +0000 UTC" firstStartedPulling="2026-04-24 21:27:30.600292759 +0000 UTC m=+3.023800144" lastFinishedPulling="2026-04-24 21:27:47.645821971 +0000 UTC m=+20.069329357" observedRunningTime="2026-04-24 21:27:48.225145163 +0000 UTC m=+20.648652569" watchObservedRunningTime="2026-04-24 21:27:48.22573542 +0000 UTC m=+20.649242828" Apr 24 21:27:48.261288 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:48.261244 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-vrsl9" podStartSLOduration=3.259998112 podStartE2EDuration="20.261230316s" podCreationTimestamp="2026-04-24 21:27:28 +0000 UTC" firstStartedPulling="2026-04-24 21:27:30.61083245 +0000 UTC m=+3.034339835" lastFinishedPulling="2026-04-24 21:27:47.612064639 +0000 UTC m=+20.035572039" observedRunningTime="2026-04-24 21:27:48.259886007 +0000 UTC m=+20.683393414" watchObservedRunningTime="2026-04-24 21:27:48.261230316 +0000 UTC m=+20.684737727" Apr 24 21:27:49.179194 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:49.179006 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 21:27:49.179572 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:49.179533 2570 generic.go:358] "Generic (PLEG): container finished" podID="7203859a-3465-4088-aaf1-c39a752936b3" containerID="e5ecf339d69cd593a02d61060e40a51ddda99f371a07f07d6feb400a8a900548" 
exitCode=1 Apr 24 21:27:49.179635 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:49.179584 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" event={"ID":"7203859a-3465-4088-aaf1-c39a752936b3","Type":"ContainerDied","Data":"e5ecf339d69cd593a02d61060e40a51ddda99f371a07f07d6feb400a8a900548"} Apr 24 21:27:49.179635 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:49.179623 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" event={"ID":"7203859a-3465-4088-aaf1-c39a752936b3","Type":"ContainerStarted","Data":"0bb4b324cd2a7b50840bbb9f5bf2237cc3fdec085ebbe55ea29ac1a23ac8b54c"} Apr 24 21:27:49.179739 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:49.179645 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" event={"ID":"7203859a-3465-4088-aaf1-c39a752936b3","Type":"ContainerStarted","Data":"fcbcd276feabad2b85dcf3c76f45306a31bfe0eab2e2555ed6829373960e69bb"} Apr 24 21:27:49.179739 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:49.179655 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" event={"ID":"7203859a-3465-4088-aaf1-c39a752936b3","Type":"ContainerStarted","Data":"4b3ad3467afb680098db3856c532650edacb09430b4d8a23de952d0ffa992fb8"} Apr 24 21:27:49.179739 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:49.179668 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" event={"ID":"7203859a-3465-4088-aaf1-c39a752936b3","Type":"ContainerStarted","Data":"10391da3fe3631c58ca5b1fe12378582fe8808d9c294abfaa1ea916cbd4c15f4"} Apr 24 21:27:49.181075 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:49.180955 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vmvcc" event={"ID":"f3e83c89-0e34-4b6d-aa0b-98737298d3d7","Type":"ContainerStarted","Data":"87eecb4a3d1212afd328062953a23bf145bbdf01b73b0ec0744d51dbec57a13b"} Apr 24 21:27:49.200696 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:49.200620 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-vmvcc" podStartSLOduration=4.189892362 podStartE2EDuration="21.200604308s" podCreationTimestamp="2026-04-24 21:27:28 +0000 UTC" firstStartedPulling="2026-04-24 21:27:30.599624025 +0000 UTC m=+3.023131425" lastFinishedPulling="2026-04-24 21:27:47.610335973 +0000 UTC m=+20.033843371" observedRunningTime="2026-04-24 21:27:49.199937551 +0000 UTC m=+21.623444958" watchObservedRunningTime="2026-04-24 21:27:49.200604308 +0000 UTC m=+21.624111715" Apr 24 21:27:49.329416 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:49.329390 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 21:27:50.030039 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:50.029924 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:27:49.329412797Z","UUID":"604c13cd-c477-4aeb-a587-3fee071d60f2","Handler":null,"Name":"","Endpoint":""} Apr 24 21:27:50.033156 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:50.033127 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock 
versions: 1.0.0 Apr 24 21:27:50.033156 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:50.033160 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 21:27:50.062545 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:50.062522 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:50.062696 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:50.062560 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:27:50.062696 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:50.062590 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:50.062813 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:50.062690 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892qf" podUID="87439f5a-542b-48ed-980f-a2183de13b6f" Apr 24 21:27:50.062977 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:50.062944 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2sfj6" podUID="bb780578-f495-4afb-a817-59c177c8993a" Apr 24 21:27:50.063112 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:50.063038 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bjr9v" podUID="1a1d36cf-baab-4f24-a9d1-4dde21da6db3" Apr 24 21:27:50.186209 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:50.186162 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" event={"ID":"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3","Type":"ContainerStarted","Data":"cbf6a1db393655f10c2782464ddab23cd15e04c5132d6b88ab0373a5ace8c426"} Apr 24 21:27:50.585879 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:50.585847 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-f5p8z" Apr 24 21:27:50.586378 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:50.586358 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-f5p8z" Apr 24 21:27:51.187764 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:51.187721 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-f5p8z" Apr 24 21:27:51.188302 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:51.188259 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-f5p8z" Apr 24 21:27:52.062797 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:52.062756 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:27:52.062797 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:52.062774 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:52.063004 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:52.062764 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:52.063004 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:52.062892 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjr9v" podUID="1a1d36cf-baab-4f24-a9d1-4dde21da6db3" Apr 24 21:27:52.063103 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:52.062996 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892qf" podUID="87439f5a-542b-48ed-980f-a2183de13b6f" Apr 24 21:27:52.063145 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:52.063101 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-2sfj6" podUID="bb780578-f495-4afb-a817-59c177c8993a" Apr 24 21:27:53.193491 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:53.193466 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 21:27:53.194054 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:53.193796 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" event={"ID":"7203859a-3465-4088-aaf1-c39a752936b3","Type":"ContainerStarted","Data":"71f98ab5311d326e093eeada1ed68f77c5bd4e0ecca0920ddc6674c48dd1b3dd"} Apr 24 21:27:53.195399 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:53.195372 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" event={"ID":"f6fa6e75-2769-408c-a21d-d05ee9ab8ea3","Type":"ContainerStarted","Data":"8a0178214c2318fcf28411e3245e02f166f8927b4f69a7f684abc937c517cf1b"} Apr 24 21:27:53.196920 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:53.196900 2570 generic.go:358] "Generic (PLEG): container finished" podID="a30efbd8-e476-46d3-a06f-675f751559d5" containerID="d2b584be7044c65ed87f3c4a693f5f0f873c16318db51dce630f927f7b96916b" exitCode=0 Apr 24 21:27:53.197011 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:53.196988 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-74945" event={"ID":"a30efbd8-e476-46d3-a06f-675f751559d5","Type":"ContainerDied","Data":"d2b584be7044c65ed87f3c4a693f5f0f873c16318db51dce630f927f7b96916b"} Apr 24 21:27:53.227418 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:53.227377 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrdbh" podStartSLOduration=3.589956151 podStartE2EDuration="25.227365225s" podCreationTimestamp="2026-04-24 21:27:28 +0000 UTC" firstStartedPulling="2026-04-24 21:27:30.612372781 +0000 UTC m=+3.035880166" lastFinishedPulling="2026-04-24 21:27:52.249781855 +0000 UTC m=+24.673289240" observedRunningTime="2026-04-24 21:27:53.227169915 +0000 UTC m=+25.650677345" watchObservedRunningTime="2026-04-24 21:27:53.227365225 +0000 UTC m=+25.650872631" Apr 24 21:27:54.063069 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:54.062997 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:54.063069 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:54.063009 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:27:54.063069 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:54.063047 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:54.063231 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:54.063125 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-2sfj6" podUID="bb780578-f495-4afb-a817-59c177c8993a" Apr 24 21:27:54.063276 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:54.063248 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjr9v" podUID="1a1d36cf-baab-4f24-a9d1-4dde21da6db3" Apr 24 21:27:54.063319 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:54.063304 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892qf" podUID="87439f5a-542b-48ed-980f-a2183de13b6f" Apr 24 21:27:54.200773 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:54.200746 2570 generic.go:358] "Generic (PLEG): container finished" podID="a30efbd8-e476-46d3-a06f-675f751559d5" containerID="202e5401a9fc532b71f66124f0d00dabd989a6c57eb012e2d3bc6c90c23449ea" exitCode=0 Apr 24 21:27:54.201145 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:54.200817 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-74945" event={"ID":"a30efbd8-e476-46d3-a06f-675f751559d5","Type":"ContainerDied","Data":"202e5401a9fc532b71f66124f0d00dabd989a6c57eb012e2d3bc6c90c23449ea"} Apr 24 21:27:54.577124 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:54.576893 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bb780578-f495-4afb-a817-59c177c8993a-original-pull-secret\") pod \"global-pull-secret-syncer-2sfj6\" (UID: \"bb780578-f495-4afb-a817-59c177c8993a\") " pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:54.577286 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:54.577058 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:54.577286 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:54.577257 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb780578-f495-4afb-a817-59c177c8993a-original-pull-secret podName:bb780578-f495-4afb-a817-59c177c8993a nodeName:}" failed. No retries permitted until 2026-04-24 21:28:10.577239424 +0000 UTC m=+43.000746825 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bb780578-f495-4afb-a817-59c177c8993a-original-pull-secret") pod "global-pull-secret-syncer-2sfj6" (UID: "bb780578-f495-4afb-a817-59c177c8993a") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:55.205034 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:55.205001 2570 generic.go:358] "Generic (PLEG): container finished" podID="a30efbd8-e476-46d3-a06f-675f751559d5" containerID="719fb2a90e2def4ff89b0b26c64f12c254fdbf6355a0f869c97ed12c840d644d" exitCode=0 Apr 24 21:27:55.205392 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:55.205122 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-74945" event={"ID":"a30efbd8-e476-46d3-a06f-675f751559d5","Type":"ContainerDied","Data":"719fb2a90e2def4ff89b0b26c64f12c254fdbf6355a0f869c97ed12c840d644d"} Apr 24 21:27:55.208300 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:55.208285 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 21:27:55.208611 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:55.208591 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" event={"ID":"7203859a-3465-4088-aaf1-c39a752936b3","Type":"ContainerStarted","Data":"c68371e50d7c69e8b02f9a0e19d15e7c037c6a38b31fa6e6360213e9247c1f5a"} Apr 24 21:27:55.208857 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:55.208842 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:55.208900 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:55.208866 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:55.209000 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:55.208985 2570 scope.go:117] "RemoveContainer" containerID="e5ecf339d69cd593a02d61060e40a51ddda99f371a07f07d6feb400a8a900548" Apr 24 21:27:55.224136 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:55.224117 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:56.062773 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:56.062743 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:56.062955 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:56.062842 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2sfj6" podUID="bb780578-f495-4afb-a817-59c177c8993a" Apr 24 21:27:56.063287 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:56.063263 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:56.063426 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:56.063405 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892qf" podUID="87439f5a-542b-48ed-980f-a2183de13b6f" Apr 24 21:27:56.063487 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:56.063476 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:27:56.063546 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:56.063522 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjr9v" podUID="1a1d36cf-baab-4f24-a9d1-4dde21da6db3" Apr 24 21:27:56.219305 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:56.219281 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 21:27:56.219708 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:56.219681 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" event={"ID":"7203859a-3465-4088-aaf1-c39a752936b3","Type":"ContainerStarted","Data":"b79e4f63656edf23a8b9d7862f0d4a8beb22400eb0fd4046a20866afd971e844"} Apr 24 21:27:56.220136 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:56.220078 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:56.245813 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:56.245668 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:27:56.268616 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:56.268570 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" podStartSLOduration=10.957094914 podStartE2EDuration="28.268552816s" podCreationTimestamp="2026-04-24 21:27:28 +0000 UTC" firstStartedPulling="2026-04-24 21:27:30.603278723 +0000 UTC m=+3.026786108" lastFinishedPulling="2026-04-24 21:27:47.91473661 +0000 UTC m=+20.338244010" observedRunningTime="2026-04-24 21:27:56.26666821 +0000 UTC m=+28.690175618" watchObservedRunningTime="2026-04-24 21:27:56.268552816 +0000 UTC m=+28.692060224" Apr 24 21:27:56.441871 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:56.441843 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bjr9v"] Apr 24 21:27:56.442049 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:56.441955 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:27:56.442111 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:56.442078 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjr9v" podUID="1a1d36cf-baab-4f24-a9d1-4dde21da6db3" Apr 24 21:27:56.447131 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:56.445734 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2sfj6"] Apr 24 21:27:56.447131 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:56.445811 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:56.447131 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:56.445889 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-2sfj6" podUID="bb780578-f495-4afb-a817-59c177c8993a" Apr 24 21:27:56.447131 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:56.446938 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-892qf"] Apr 24 21:27:56.447131 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:56.447008 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:56.447131 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:56.447104 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892qf" podUID="87439f5a-542b-48ed-980f-a2183de13b6f" Apr 24 21:27:58.064041 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:58.064000 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:27:58.064461 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:58.064000 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:27:58.064461 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:58.064130 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-2sfj6" podUID="bb780578-f495-4afb-a817-59c177c8993a" Apr 24 21:27:58.064461 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:58.064199 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892qf" podUID="87439f5a-542b-48ed-980f-a2183de13b6f" Apr 24 21:27:58.064461 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:27:58.064000 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:27:58.064461 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:27:58.064257 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjr9v" podUID="1a1d36cf-baab-4f24-a9d1-4dde21da6db3" Apr 24 21:28:00.063425 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.063389 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:28:00.063936 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.063389 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:28:00.063936 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:00.063505 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bjr9v" podUID="1a1d36cf-baab-4f24-a9d1-4dde21da6db3" Apr 24 21:28:00.063936 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:00.063576 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892qf" podUID="87439f5a-542b-48ed-980f-a2183de13b6f" Apr 24 21:28:00.063936 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.063389 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:28:00.063936 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:00.063669 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-2sfj6" podUID="bb780578-f495-4afb-a817-59c177c8993a" Apr 24 21:28:00.451668 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.451635 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-230.ec2.internal" event="NodeReady" Apr 24 21:28:00.451806 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.451775 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 21:28:00.491487 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.491456 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s"] Apr 24 21:28:00.523455 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.523399 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d967f567f-7xx46"] Apr 24 21:28:00.523623 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.523529 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s" Apr 24 21:28:00.530915 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.530891 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 24 21:28:00.532080 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.531481 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 24 21:28:00.532080 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.531725 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 24 21:28:00.535422 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.534957 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 24 21:28:00.542405 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.542385 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-xm65x"] Apr 24 21:28:00.542539 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.542523 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d967f567f-7xx46" Apr 24 21:28:00.544837 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.544814 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 24 21:28:00.545280 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.545256 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-7jthv\"" Apr 24 21:28:00.563411 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.563392 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-76585dbdfb-6n5xh"] Apr 24 21:28:00.563557 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.563541 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-xm65x" Apr 24 21:28:00.565595 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.565574 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 24 21:28:00.565902 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.565884 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-bh4pt\"" Apr 24 21:28:00.566066 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.565998 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 24 21:28:00.570941 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.570924 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs"] Apr 24 21:28:00.571104 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.571080 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.573837 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.573820 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 21:28:00.573951 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.573880 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 21:28:00.574612 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.574592 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6rrzj\"" Apr 24 21:28:00.574686 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.574648 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 21:28:00.579466 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.579445 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 21:28:00.595328 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.595305 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s"] Apr 24 21:28:00.595409 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.595335 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-xm65x"] Apr 24 21:28:00.595409 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.595351 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-76585dbdfb-6n5xh"] Apr 24 21:28:00.595409 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.595364 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs"] Apr 24 21:28:00.595409 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.595374 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d967f567f-7xx46"] Apr 24 21:28:00.595409 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.595388 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-dns/dns-default-9nj6p"] Apr 24 21:28:00.595604 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.595409 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" Apr 24 21:28:00.598343 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.598312 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 24 21:28:00.598688 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.598668 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 24 21:28:00.598769 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.598706 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 24 21:28:00.598820 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.598676 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 24 21:28:00.613762 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.613743 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-p4k7x"] Apr 24 21:28:00.613919 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.613905 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9nj6p" Apr 24 21:28:00.618349 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.618330 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:28:00.618434 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.618396 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5mvz7\"" Apr 24 21:28:00.618713 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.618695 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:28:00.622860 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.622841 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b75151dd-6829-4490-8047-359e80fcc2f9-tmp\") pod \"klusterlet-addon-workmgr-5d4fb7d7d5-czg9s\" (UID: \"b75151dd-6829-4490-8047-359e80fcc2f9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s" Apr 24 21:28:00.622955 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.622873 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b75151dd-6829-4490-8047-359e80fcc2f9-klusterlet-config\") pod \"klusterlet-addon-workmgr-5d4fb7d7d5-czg9s\" (UID: \"b75151dd-6829-4490-8047-359e80fcc2f9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s" Apr 24 21:28:00.622955 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.622897 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bhvv\" (UniqueName: \"kubernetes.io/projected/b75151dd-6829-4490-8047-359e80fcc2f9-kube-api-access-5bhvv\") pod 
\"klusterlet-addon-workmgr-5d4fb7d7d5-czg9s\" (UID: \"b75151dd-6829-4490-8047-359e80fcc2f9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s" Apr 24 21:28:00.639175 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.639144 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9nj6p"] Apr 24 21:28:00.639175 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.639171 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p4k7x"] Apr 24 21:28:00.639301 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.639286 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p4k7x" Apr 24 21:28:00.641486 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.641461 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:28:00.641626 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.641604 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-48pk8\"" Apr 24 21:28:00.641626 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.641621 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:28:00.641865 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.641846 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:28:00.723363 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.723334 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls\") pod \"dns-default-9nj6p\" (UID: \"0c465288-2f3f-4fc1-9192-76111e546363\") " pod="openshift-dns/dns-default-9nj6p" Apr 24 21:28:00.723518 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.723370 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xm65x\" (UID: \"755a83ac-6e0b-4533-8c76-435876e1c64e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xm65x" Apr 24 21:28:00.723518 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.723407 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/5e907be1-1dd2-42ef-82cb-1889550f56df-ca\") pod \"cluster-proxy-proxy-agent-59d5495f66-wbtqs\" (UID: \"5e907be1-1dd2-42ef-82cb-1889550f56df\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" Apr 24 21:28:00.723518 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.723432 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/5e907be1-1dd2-42ef-82cb-1889550f56df-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-59d5495f66-wbtqs\" (UID: \"5e907be1-1dd2-42ef-82cb-1889550f56df\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" Apr 24 21:28:00.723518 ip-10-0-129-230 kubenswrapper[2570]: I0424 
21:28:00.723490 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/deb27629-83f5-41b3-b267-7644e8af713e-image-registry-private-configuration\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.723518 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.723515 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkd52\" (UniqueName: \"kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-kube-api-access-tkd52\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.723741 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.723536 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5e907be1-1dd2-42ef-82cb-1889550f56df-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-59d5495f66-wbtqs\" (UID: \"5e907be1-1dd2-42ef-82cb-1889550f56df\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" Apr 24 21:28:00.723741 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.723551 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/deb27629-83f5-41b3-b267-7644e8af713e-registry-certificates\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.723741 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.723573 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/005374f7-a129-4b9a-aa9d-391fff615391-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7d967f567f-7xx46\" (UID: \"005374f7-a129-4b9a-aa9d-391fff615391\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d967f567f-7xx46" Apr 24 21:28:00.723741 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.723612 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn7gl\" (UniqueName: \"kubernetes.io/projected/5e907be1-1dd2-42ef-82cb-1889550f56df-kube-api-access-vn7gl\") pod \"cluster-proxy-proxy-agent-59d5495f66-wbtqs\" (UID: \"5e907be1-1dd2-42ef-82cb-1889550f56df\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" Apr 24 21:28:00.723741 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.723640 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/deb27629-83f5-41b3-b267-7644e8af713e-installation-pull-secrets\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.723741 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.723660 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-bound-sa-token\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.723741 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.723729 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/5e907be1-1dd2-42ef-82cb-1889550f56df-hub\") pod \"cluster-proxy-proxy-agent-59d5495f66-wbtqs\" (UID: \"5e907be1-1dd2-42ef-82cb-1889550f56df\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" Apr 24 21:28:00.723982 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.723767 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/deb27629-83f5-41b3-b267-7644e8af713e-trusted-ca\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.723982 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.723796 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7czl\" (UniqueName: \"kubernetes.io/projected/005374f7-a129-4b9a-aa9d-391fff615391-kube-api-access-g7czl\") pod \"managed-serviceaccount-addon-agent-7d967f567f-7xx46\" (UID: \"005374f7-a129-4b9a-aa9d-391fff615391\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d967f567f-7xx46" Apr 24 21:28:00.723982 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.723825 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0c465288-2f3f-4fc1-9192-76111e546363-tmp-dir\") pod \"dns-default-9nj6p\" (UID: \"0c465288-2f3f-4fc1-9192-76111e546363\") " pod="openshift-dns/dns-default-9nj6p" Apr 24 21:28:00.723982 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.723872 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/755a83ac-6e0b-4533-8c76-435876e1c64e-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-xm65x\" (UID: \"755a83ac-6e0b-4533-8c76-435876e1c64e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xm65x" Apr 24 21:28:00.723982 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.723898 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5e907be1-1dd2-42ef-82cb-1889550f56df-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-59d5495f66-wbtqs\" (UID: \"5e907be1-1dd2-42ef-82cb-1889550f56df\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" Apr 24 21:28:00.723982 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.723922 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/deb27629-83f5-41b3-b267-7644e8af713e-ca-trust-extracted\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.724229 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.723984 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.724229 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.724011 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x275c\" (UniqueName: \"kubernetes.io/projected/0c465288-2f3f-4fc1-9192-76111e546363-kube-api-access-x275c\") pod \"dns-default-9nj6p\" (UID: \"0c465288-2f3f-4fc1-9192-76111e546363\") " pod="openshift-dns/dns-default-9nj6p" Apr 24 21:28:00.724229 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.724088 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b75151dd-6829-4490-8047-359e80fcc2f9-tmp\") pod \"klusterlet-addon-workmgr-5d4fb7d7d5-czg9s\" (UID: \"b75151dd-6829-4490-8047-359e80fcc2f9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s" Apr 24 21:28:00.724229 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.724130 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b75151dd-6829-4490-8047-359e80fcc2f9-klusterlet-config\") pod \"klusterlet-addon-workmgr-5d4fb7d7d5-czg9s\" (UID: \"b75151dd-6829-4490-8047-359e80fcc2f9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s" Apr 24 21:28:00.724229 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.724149 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bhvv\" (UniqueName: \"kubernetes.io/projected/b75151dd-6829-4490-8047-359e80fcc2f9-kube-api-access-5bhvv\") pod \"klusterlet-addon-workmgr-5d4fb7d7d5-czg9s\" (UID: \"b75151dd-6829-4490-8047-359e80fcc2f9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s" Apr 24 21:28:00.724229 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.724166 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c465288-2f3f-4fc1-9192-76111e546363-config-volume\") pod \"dns-default-9nj6p\" (UID: \"0c465288-2f3f-4fc1-9192-76111e546363\") " pod="openshift-dns/dns-default-9nj6p" Apr 24 21:28:00.724672 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.724654 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b75151dd-6829-4490-8047-359e80fcc2f9-tmp\") pod \"klusterlet-addon-workmgr-5d4fb7d7d5-czg9s\" (UID: \"b75151dd-6829-4490-8047-359e80fcc2f9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s" Apr 24 21:28:00.728627 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.728604 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b75151dd-6829-4490-8047-359e80fcc2f9-klusterlet-config\") pod \"klusterlet-addon-workmgr-5d4fb7d7d5-czg9s\" (UID: \"b75151dd-6829-4490-8047-359e80fcc2f9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s" Apr 24 21:28:00.733273 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.733255 2570 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5bhvv\" (UniqueName: \"kubernetes.io/projected/b75151dd-6829-4490-8047-359e80fcc2f9-kube-api-access-5bhvv\") pod \"klusterlet-addon-workmgr-5d4fb7d7d5-czg9s\" (UID: \"b75151dd-6829-4490-8047-359e80fcc2f9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s" Apr 24 21:28:00.825283 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.825216 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5e907be1-1dd2-42ef-82cb-1889550f56df-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-59d5495f66-wbtqs\" (UID: \"5e907be1-1dd2-42ef-82cb-1889550f56df\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" Apr 24 21:28:00.825283 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.825250 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/005374f7-a129-4b9a-aa9d-391fff615391-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7d967f567f-7xx46\" (UID: \"005374f7-a129-4b9a-aa9d-391fff615391\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d967f567f-7xx46" Apr 24 21:28:00.825283 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.825273 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/deb27629-83f5-41b3-b267-7644e8af713e-image-registry-private-configuration\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.825467 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.825356 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/deb27629-83f5-41b3-b267-7644e8af713e-registry-certificates\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.825467 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.825399 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vn7gl\" (UniqueName: \"kubernetes.io/projected/5e907be1-1dd2-42ef-82cb-1889550f56df-kube-api-access-vn7gl\") pod \"cluster-proxy-proxy-agent-59d5495f66-wbtqs\" (UID: \"5e907be1-1dd2-42ef-82cb-1889550f56df\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" Apr 24 21:28:00.825467 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.825428 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/5e907be1-1dd2-42ef-82cb-1889550f56df-hub\") pod \"cluster-proxy-proxy-agent-59d5495f66-wbtqs\" (UID: \"5e907be1-1dd2-42ef-82cb-1889550f56df\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" Apr 24 21:28:00.825566 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.825544 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/deb27629-83f5-41b3-b267-7644e8af713e-trusted-ca\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.825608 ip-10-0-129-230 
kubenswrapper[2570]: I0424 21:28:00.825596 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/755a83ac-6e0b-4533-8c76-435876e1c64e-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-xm65x\" (UID: \"755a83ac-6e0b-4533-8c76-435876e1c64e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xm65x" Apr 24 21:28:00.825660 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.825628 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.825729 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.825662 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c465288-2f3f-4fc1-9192-76111e546363-config-volume\") pod \"dns-default-9nj6p\" (UID: \"0c465288-2f3f-4fc1-9192-76111e546363\") " pod="openshift-dns/dns-default-9nj6p" Apr 24 21:28:00.825729 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.825705 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp545\" (UniqueName: \"kubernetes.io/projected/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-kube-api-access-vp545\") pod \"ingress-canary-p4k7x\" (UID: \"4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5\") " pod="openshift-ingress-canary/ingress-canary-p4k7x" Apr 24 21:28:00.825840 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.825741 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xm65x\" (UID: \"755a83ac-6e0b-4533-8c76-435876e1c64e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xm65x" Apr 24 21:28:00.825840 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:00.825777 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:28:00.825840 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:00.825796 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76585dbdfb-6n5xh: secret "image-registry-tls" not found Apr 24 21:28:00.825992 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:00.825855 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls podName:deb27629-83f5-41b3-b267-7644e8af713e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:01.325834226 +0000 UTC m=+33.749341618 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls") pod "image-registry-76585dbdfb-6n5xh" (UID: "deb27629-83f5-41b3-b267-7644e8af713e") : secret "image-registry-tls" not found Apr 24 21:28:00.826083 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.825988 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/deb27629-83f5-41b3-b267-7644e8af713e-registry-certificates\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.826255 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.825782 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/5e907be1-1dd2-42ef-82cb-1889550f56df-ca\") pod \"cluster-proxy-proxy-agent-59d5495f66-wbtqs\" (UID: \"5e907be1-1dd2-42ef-82cb-1889550f56df\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" Apr 24 21:28:00.826352 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.826286 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/5e907be1-1dd2-42ef-82cb-1889550f56df-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-59d5495f66-wbtqs\" (UID: \"5e907be1-1dd2-42ef-82cb-1889550f56df\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" Apr 24 21:28:00.826352 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.826325 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkd52\" (UniqueName: \"kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-kube-api-access-tkd52\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.826457 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.826358 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/deb27629-83f5-41b3-b267-7644e8af713e-installation-pull-secrets\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.826457 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.826384 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-bound-sa-token\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.826457 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.826413 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0c465288-2f3f-4fc1-9192-76111e546363-tmp-dir\") pod \"dns-default-9nj6p\" (UID: \"0c465288-2f3f-4fc1-9192-76111e546363\") " pod="openshift-dns/dns-default-9nj6p" Apr 24 21:28:00.826457 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.826439 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g7czl\" (UniqueName: 
\"kubernetes.io/projected/005374f7-a129-4b9a-aa9d-391fff615391-kube-api-access-g7czl\") pod \"managed-serviceaccount-addon-agent-7d967f567f-7xx46\" (UID: \"005374f7-a129-4b9a-aa9d-391fff615391\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d967f567f-7xx46" Apr 24 21:28:00.826656 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.826484 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5e907be1-1dd2-42ef-82cb-1889550f56df-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-59d5495f66-wbtqs\" (UID: \"5e907be1-1dd2-42ef-82cb-1889550f56df\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" Apr 24 21:28:00.826656 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.826509 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/deb27629-83f5-41b3-b267-7644e8af713e-ca-trust-extracted\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.826656 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.826532 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x275c\" (UniqueName: \"kubernetes.io/projected/0c465288-2f3f-4fc1-9192-76111e546363-kube-api-access-x275c\") pod \"dns-default-9nj6p\" (UID: \"0c465288-2f3f-4fc1-9192-76111e546363\") " pod="openshift-dns/dns-default-9nj6p" Apr 24 21:28:00.826656 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.826580 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls\") pod \"dns-default-9nj6p\" (UID: \"0c465288-2f3f-4fc1-9192-76111e546363\") " pod="openshift-dns/dns-default-9nj6p" Apr 24 21:28:00.826656 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.826587 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/755a83ac-6e0b-4533-8c76-435876e1c64e-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-xm65x\" (UID: \"755a83ac-6e0b-4533-8c76-435876e1c64e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xm65x" Apr 24 21:28:00.826656 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.826626 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert\") pod \"ingress-canary-p4k7x\" (UID: \"4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5\") " pod="openshift-ingress-canary/ingress-canary-p4k7x" Apr 24 21:28:00.826917 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:00.826692 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:28:00.826917 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.826712 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/deb27629-83f5-41b3-b267-7644e8af713e-trusted-ca\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.826917 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:00.826750 2570 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert podName:755a83ac-6e0b-4533-8c76-435876e1c64e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:01.326734986 +0000 UTC m=+33.750242395 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-xm65x" (UID: "755a83ac-6e0b-4533-8c76-435876e1c64e") : secret "networking-console-plugin-cert" not found Apr 24 21:28:00.826917 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.826751 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c465288-2f3f-4fc1-9192-76111e546363-config-volume\") pod \"dns-default-9nj6p\" (UID: \"0c465288-2f3f-4fc1-9192-76111e546363\") " pod="openshift-dns/dns-default-9nj6p" Apr 24 21:28:00.827404 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.827384 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0c465288-2f3f-4fc1-9192-76111e546363-tmp-dir\") pod \"dns-default-9nj6p\" (UID: \"0c465288-2f3f-4fc1-9192-76111e546363\") " pod="openshift-dns/dns-default-9nj6p" Apr 24 21:28:00.827489 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.827405 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5e907be1-1dd2-42ef-82cb-1889550f56df-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-59d5495f66-wbtqs\" (UID: \"5e907be1-1dd2-42ef-82cb-1889550f56df\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" Apr 24 21:28:00.827669 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.827643 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/deb27629-83f5-41b3-b267-7644e8af713e-ca-trust-extracted\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.827755 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:00.827741 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:28:00.827821 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:00.827801 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls podName:0c465288-2f3f-4fc1-9192-76111e546363 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:01.327785617 +0000 UTC m=+33.751293017 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls") pod "dns-default-9nj6p" (UID: "0c465288-2f3f-4fc1-9192-76111e546363") : secret "dns-default-metrics-tls" not found Apr 24 21:28:00.828695 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.828669 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/5e907be1-1dd2-42ef-82cb-1889550f56df-ca\") pod \"cluster-proxy-proxy-agent-59d5495f66-wbtqs\" (UID: \"5e907be1-1dd2-42ef-82cb-1889550f56df\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" Apr 24 21:28:00.828778 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.828761 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/005374f7-a129-4b9a-aa9d-391fff615391-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7d967f567f-7xx46\" (UID: \"005374f7-a129-4b9a-aa9d-391fff615391\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d967f567f-7xx46" Apr 24 21:28:00.829257 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.829226 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/5e907be1-1dd2-42ef-82cb-1889550f56df-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-59d5495f66-wbtqs\" (UID: \"5e907be1-1dd2-42ef-82cb-1889550f56df\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" Apr 24 21:28:00.829980 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.829957 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/5e907be1-1dd2-42ef-82cb-1889550f56df-hub\") pod \"cluster-proxy-proxy-agent-59d5495f66-wbtqs\" (UID: \"5e907be1-1dd2-42ef-82cb-1889550f56df\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" Apr 24 21:28:00.830159 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.830143 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5e907be1-1dd2-42ef-82cb-1889550f56df-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-59d5495f66-wbtqs\" (UID: \"5e907be1-1dd2-42ef-82cb-1889550f56df\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" Apr 24 21:28:00.834119 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.834099 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/deb27629-83f5-41b3-b267-7644e8af713e-installation-pull-secrets\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.834119 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.834113 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/deb27629-83f5-41b3-b267-7644e8af713e-image-registry-private-configuration\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.839423 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.838928 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s" Apr 24 21:28:00.839627 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.839584 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-bound-sa-token\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.840654 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.840456 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn7gl\" (UniqueName: \"kubernetes.io/projected/5e907be1-1dd2-42ef-82cb-1889550f56df-kube-api-access-vn7gl\") pod \"cluster-proxy-proxy-agent-59d5495f66-wbtqs\" (UID: \"5e907be1-1dd2-42ef-82cb-1889550f56df\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" Apr 24 21:28:00.840654 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.840482 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkd52\" (UniqueName: \"kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-kube-api-access-tkd52\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:00.841745 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.841728 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x275c\" (UniqueName: \"kubernetes.io/projected/0c465288-2f3f-4fc1-9192-76111e546363-kube-api-access-x275c\") pod \"dns-default-9nj6p\" (UID: \"0c465288-2f3f-4fc1-9192-76111e546363\") " pod="openshift-dns/dns-default-9nj6p" Apr 24 21:28:00.845276 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.845241 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7czl\" (UniqueName: \"kubernetes.io/projected/005374f7-a129-4b9a-aa9d-391fff615391-kube-api-access-g7czl\") pod \"managed-serviceaccount-addon-agent-7d967f567f-7xx46\" (UID: \"005374f7-a129-4b9a-aa9d-391fff615391\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d967f567f-7xx46" Apr 24 21:28:00.860202 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.860119 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d967f567f-7xx46" Apr 24 21:28:00.906940 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.906909 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" Apr 24 21:28:00.928599 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.927874 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert\") pod \"ingress-canary-p4k7x\" (UID: \"4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5\") " pod="openshift-ingress-canary/ingress-canary-p4k7x" Apr 24 21:28:00.928599 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.928198 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vp545\" (UniqueName: \"kubernetes.io/projected/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-kube-api-access-vp545\") pod \"ingress-canary-p4k7x\" (UID: \"4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5\") " pod="openshift-ingress-canary/ingress-canary-p4k7x" Apr 24 21:28:00.928599 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:00.928397 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:28:00.928599 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:00.928505 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert podName:4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:01.428483664 +0000 UTC m=+33.851991059 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert") pod "ingress-canary-p4k7x" (UID: "4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5") : secret "canary-serving-cert" not found Apr 24 21:28:00.944459 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:00.944433 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp545\" (UniqueName: \"kubernetes.io/projected/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-kube-api-access-vp545\") pod \"ingress-canary-p4k7x\" (UID: \"4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5\") " pod="openshift-ingress-canary/ingress-canary-p4k7x" Apr 24 21:28:01.007101 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:01.007073 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s"] Apr 24 21:28:01.015170 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:01.015143 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d967f567f-7xx46"] Apr 24 21:28:01.040772 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:28:01.040744 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb75151dd_6829_4490_8047_359e80fcc2f9.slice/crio-64f89dc8b37f37d2d4808e38063266cfefba84a77493d6b6c2ce414103e20146 WatchSource:0}: Error finding container 64f89dc8b37f37d2d4808e38063266cfefba84a77493d6b6c2ce414103e20146: Status 404 returned error can't find the container with id 64f89dc8b37f37d2d4808e38063266cfefba84a77493d6b6c2ce414103e20146 Apr 24 21:28:01.041348 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:28:01.041329 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod005374f7_a129_4b9a_aa9d_391fff615391.slice/crio-a9659454c1fcf6ca70b6e0d116bb0cd68ef207b926c270a2c367932529bc16ca WatchSource:0}: Error finding container 
a9659454c1fcf6ca70b6e0d116bb0cd68ef207b926c270a2c367932529bc16ca: Status 404 returned error can't find the container with id a9659454c1fcf6ca70b6e0d116bb0cd68ef207b926c270a2c367932529bc16ca Apr 24 21:28:01.055679 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:01.055652 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs"] Apr 24 21:28:01.060295 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:28:01.060262 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e907be1_1dd2_42ef_82cb_1889550f56df.slice/crio-2ec9513ef2781f9df82305d77030b275b0ec859d17b861f76a0edb477fade80b WatchSource:0}: Error finding container 2ec9513ef2781f9df82305d77030b275b0ec859d17b861f76a0edb477fade80b: Status 404 returned error can't find the container with id 2ec9513ef2781f9df82305d77030b275b0ec859d17b861f76a0edb477fade80b Apr 24 21:28:01.232508 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:01.232475 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s" event={"ID":"b75151dd-6829-4490-8047-359e80fcc2f9","Type":"ContainerStarted","Data":"64f89dc8b37f37d2d4808e38063266cfefba84a77493d6b6c2ce414103e20146"} Apr 24 21:28:01.233265 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:01.233243 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" event={"ID":"5e907be1-1dd2-42ef-82cb-1889550f56df","Type":"ContainerStarted","Data":"2ec9513ef2781f9df82305d77030b275b0ec859d17b861f76a0edb477fade80b"} Apr 24 21:28:01.234132 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:01.234110 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d967f567f-7xx46" event={"ID":"005374f7-a129-4b9a-aa9d-391fff615391","Type":"ContainerStarted","Data":"a9659454c1fcf6ca70b6e0d116bb0cd68ef207b926c270a2c367932529bc16ca"} Apr 24 21:28:01.332623 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:01.332536 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls\") pod \"dns-default-9nj6p\" (UID: \"0c465288-2f3f-4fc1-9192-76111e546363\") " pod="openshift-dns/dns-default-9nj6p" Apr 24 21:28:01.332796 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:01.332640 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:01.332796 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:01.332679 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xm65x\" (UID: \"755a83ac-6e0b-4533-8c76-435876e1c64e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xm65x" Apr 24 21:28:01.332796 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:01.332691 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: 
secret "dns-default-metrics-tls" not found Apr 24 21:28:01.332796 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:01.332755 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls podName:0c465288-2f3f-4fc1-9192-76111e546363 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:02.332740219 +0000 UTC m=+34.756247603 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls") pod "dns-default-9nj6p" (UID: "0c465288-2f3f-4fc1-9192-76111e546363") : secret "dns-default-metrics-tls" not found Apr 24 21:28:01.332796 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:01.332774 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:28:01.332796 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:01.332789 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:28:01.333011 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:01.332840 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert podName:755a83ac-6e0b-4533-8c76-435876e1c64e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:02.332828652 +0000 UTC m=+34.756336037 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-xm65x" (UID: "755a83ac-6e0b-4533-8c76-435876e1c64e") : secret "networking-console-plugin-cert" not found Apr 24 21:28:01.333011 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:01.332791 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76585dbdfb-6n5xh: secret "image-registry-tls" not found Apr 24 21:28:01.333011 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:01.332871 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls podName:deb27629-83f5-41b3-b267-7644e8af713e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:02.332863356 +0000 UTC m=+34.756370740 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls") pod "image-registry-76585dbdfb-6n5xh" (UID: "deb27629-83f5-41b3-b267-7644e8af713e") : secret "image-registry-tls" not found Apr 24 21:28:01.433065 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:01.433014 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert\") pod \"ingress-canary-p4k7x\" (UID: \"4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5\") " pod="openshift-ingress-canary/ingress-canary-p4k7x" Apr 24 21:28:01.433189 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:01.433168 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:28:01.433238 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:01.433228 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert podName:4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:02.433213948 +0000 UTC m=+34.856721332 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert") pod "ingress-canary-p4k7x" (UID: "4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5") : secret "canary-serving-cert" not found Apr 24 21:28:01.740999 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:01.736122 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpxms\" (UniqueName: \"kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms\") pod \"network-check-target-bjr9v\" (UID: \"1a1d36cf-baab-4f24-a9d1-4dde21da6db3\") " pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:28:01.740999 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:01.736177 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs\") pod \"network-metrics-daemon-892qf\" (UID: \"87439f5a-542b-48ed-980f-a2183de13b6f\") " pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:28:01.740999 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:01.736403 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:01.740999 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:01.736460 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs podName:87439f5a-542b-48ed-980f-a2183de13b6f nodeName:}" failed. No retries permitted until 2026-04-24 21:28:33.736442533 +0000 UTC m=+66.159949921 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs") pod "network-metrics-daemon-892qf" (UID: "87439f5a-542b-48ed-980f-a2183de13b6f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:01.740999 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:01.736798 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:28:01.740999 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:01.736812 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:28:01.740999 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:01.736822 2570 projected.go:194] Error preparing data for projected volume kube-api-access-wpxms for pod openshift-network-diagnostics/network-check-target-bjr9v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:01.740999 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:01.736865 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms podName:1a1d36cf-baab-4f24-a9d1-4dde21da6db3 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:33.736851178 +0000 UTC m=+66.160358564 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-wpxms" (UniqueName: "kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms") pod "network-check-target-bjr9v" (UID: "1a1d36cf-baab-4f24-a9d1-4dde21da6db3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:02.065288 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:02.065207 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:28:02.065288 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:02.065245 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:28:02.065512 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:02.065352 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:28:02.070875 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:02.069396 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sctkp\"" Apr 24 21:28:02.070875 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:02.069735 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 21:28:02.070875 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:02.069914 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:28:02.070875 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:02.070123 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:28:02.070875 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:02.070442 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mgr5d\"" Apr 24 21:28:02.070875 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:02.070654 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:28:02.244575 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:02.244538 2570 generic.go:358] "Generic (PLEG): container finished" podID="a30efbd8-e476-46d3-a06f-675f751559d5" containerID="7bf362879d9a88fa2e5227fa477f49d851e21dd0fbab9a44677dce3e30019f9b" exitCode=0 Apr 24 21:28:02.245045 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:02.244586 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-74945" event={"ID":"a30efbd8-e476-46d3-a06f-675f751559d5","Type":"ContainerDied","Data":"7bf362879d9a88fa2e5227fa477f49d851e21dd0fbab9a44677dce3e30019f9b"} Apr 24 21:28:02.340951 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:02.340879 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:02.340951 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:02.340934 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xm65x\" (UID: \"755a83ac-6e0b-4533-8c76-435876e1c64e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xm65x" Apr 24 21:28:02.341168 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:02.341039 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls\") pod \"dns-default-9nj6p\" (UID: \"0c465288-2f3f-4fc1-9192-76111e546363\") " pod="openshift-dns/dns-default-9nj6p" Apr 24 21:28:02.341228 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:02.341175 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:28:02.341275 ip-10-0-129-230 
kubenswrapper[2570]: E0424 21:28:02.341237 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls podName:0c465288-2f3f-4fc1-9192-76111e546363 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:04.341218156 +0000 UTC m=+36.764725558 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls") pod "dns-default-9nj6p" (UID: "0c465288-2f3f-4fc1-9192-76111e546363") : secret "dns-default-metrics-tls" not found Apr 24 21:28:02.343264 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:02.342841 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:28:02.343264 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:02.342862 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76585dbdfb-6n5xh: secret "image-registry-tls" not found Apr 24 21:28:02.343264 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:02.342908 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls podName:deb27629-83f5-41b3-b267-7644e8af713e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:04.342892159 +0000 UTC m=+36.766399550 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls") pod "image-registry-76585dbdfb-6n5xh" (UID: "deb27629-83f5-41b3-b267-7644e8af713e") : secret "image-registry-tls" not found Apr 24 21:28:02.343264 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:02.342981 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:28:02.343264 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:02.343039 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert podName:755a83ac-6e0b-4533-8c76-435876e1c64e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:04.343007356 +0000 UTC m=+36.766514747 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-xm65x" (UID: "755a83ac-6e0b-4533-8c76-435876e1c64e") : secret "networking-console-plugin-cert" not found Apr 24 21:28:02.441839 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:02.441662 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert\") pod \"ingress-canary-p4k7x\" (UID: \"4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5\") " pod="openshift-ingress-canary/ingress-canary-p4k7x" Apr 24 21:28:02.441839 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:02.441775 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:28:02.441839 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:02.441831 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert podName:4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:04.441814401 +0000 UTC m=+36.865321787 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert") pod "ingress-canary-p4k7x" (UID: "4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5") : secret "canary-serving-cert" not found Apr 24 21:28:03.252801 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:03.252279 2570 generic.go:358] "Generic (PLEG): container finished" podID="a30efbd8-e476-46d3-a06f-675f751559d5" containerID="a3d252b9fc5747e714b7ad7a7ea648bac6cfee4421e7ed41fb3201e86af329cb" exitCode=0 Apr 24 21:28:03.252801 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:03.252361 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-74945" event={"ID":"a30efbd8-e476-46d3-a06f-675f751559d5","Type":"ContainerDied","Data":"a3d252b9fc5747e714b7ad7a7ea648bac6cfee4421e7ed41fb3201e86af329cb"} Apr 24 21:28:04.360093 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:04.359989 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls\") pod \"dns-default-9nj6p\" (UID: \"0c465288-2f3f-4fc1-9192-76111e546363\") " pod="openshift-dns/dns-default-9nj6p" Apr 24 21:28:04.360485 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:04.360104 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:04.360485 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:04.360142 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xm65x\" (UID: \"755a83ac-6e0b-4533-8c76-435876e1c64e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xm65x" Apr 24 21:28:04.360485 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:04.360185 2570 secret.go:189] 
Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:28:04.360485 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:04.360236 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:28:04.360485 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:04.360250 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76585dbdfb-6n5xh: secret "image-registry-tls" not found Apr 24 21:28:04.360485 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:04.360261 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls podName:0c465288-2f3f-4fc1-9192-76111e546363 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.360240428 +0000 UTC m=+40.783747815 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls") pod "dns-default-9nj6p" (UID: "0c465288-2f3f-4fc1-9192-76111e546363") : secret "dns-default-metrics-tls" not found Apr 24 21:28:04.360485 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:04.360288 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls podName:deb27629-83f5-41b3-b267-7644e8af713e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.36027635 +0000 UTC m=+40.783783756 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls") pod "image-registry-76585dbdfb-6n5xh" (UID: "deb27629-83f5-41b3-b267-7644e8af713e") : secret "image-registry-tls" not found Apr 24 21:28:04.360485 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:04.360290 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:28:04.360485 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:04.360357 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert podName:755a83ac-6e0b-4533-8c76-435876e1c64e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.360340603 +0000 UTC m=+40.783848011 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-xm65x" (UID: "755a83ac-6e0b-4533-8c76-435876e1c64e") : secret "networking-console-plugin-cert" not found Apr 24 21:28:04.461243 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:04.461211 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert\") pod \"ingress-canary-p4k7x\" (UID: \"4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5\") " pod="openshift-ingress-canary/ingress-canary-p4k7x" Apr 24 21:28:04.461374 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:04.461355 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:28:04.461423 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:04.461417 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert podName:4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.461402876 +0000 UTC m=+40.884910261 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert") pod "ingress-canary-p4k7x" (UID: "4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5") : secret "canary-serving-cert" not found Apr 24 21:28:07.262567 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:07.262365 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-74945" event={"ID":"a30efbd8-e476-46d3-a06f-675f751559d5","Type":"ContainerStarted","Data":"73e6956d5b2917dea97680bb55a4219bbbacf4e4e47c3af84109a2585d39c29f"} Apr 24 21:28:07.264406 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:07.264367 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s" event={"ID":"b75151dd-6829-4490-8047-359e80fcc2f9","Type":"ContainerStarted","Data":"90b8a08a8aa9e58cab22b9871ff49cee28921edb8b93cbe61c2d62b561df8565"} Apr 24 21:28:07.264699 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:07.264555 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s" Apr 24 21:28:07.266080 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:07.266057 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" event={"ID":"5e907be1-1dd2-42ef-82cb-1889550f56df","Type":"ContainerStarted","Data":"f99a3fbda2b6d41d464fa04ea2f48c50ccf870fe1948193c2a51a35665dc1614"} Apr 24 21:28:07.266196 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:07.266157 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s" Apr 24 21:28:07.267456 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:07.267436 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d967f567f-7xx46" event={"ID":"005374f7-a129-4b9a-aa9d-391fff615391","Type":"ContainerStarted","Data":"db43f5fe95a1feba31fac1d37713a391a8163c730ee6363a9a8868ad37013bc4"} Apr 24 21:28:07.298582 ip-10-0-129-230 
kubenswrapper[2570]: I0424 21:28:07.298541 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-74945" podStartSLOduration=8.820770359 podStartE2EDuration="39.298529399s" podCreationTimestamp="2026-04-24 21:27:28 +0000 UTC" firstStartedPulling="2026-04-24 21:27:30.603286172 +0000 UTC m=+3.026793566" lastFinishedPulling="2026-04-24 21:28:01.081045222 +0000 UTC m=+33.504552606" observedRunningTime="2026-04-24 21:28:07.297989074 +0000 UTC m=+39.721496488" watchObservedRunningTime="2026-04-24 21:28:07.298529399 +0000 UTC m=+39.722036806" Apr 24 21:28:07.319784 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:07.319727 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s" podStartSLOduration=23.301964071 podStartE2EDuration="29.319716586s" podCreationTimestamp="2026-04-24 21:27:38 +0000 UTC" firstStartedPulling="2026-04-24 21:28:01.055980825 +0000 UTC m=+33.479488210" lastFinishedPulling="2026-04-24 21:28:07.073733339 +0000 UTC m=+39.497240725" observedRunningTime="2026-04-24 21:28:07.318862204 +0000 UTC m=+39.742369612" watchObservedRunningTime="2026-04-24 21:28:07.319716586 +0000 UTC m=+39.743223993" Apr 24 21:28:07.348380 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:07.348313 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d967f567f-7xx46" podStartSLOduration=23.347560143 podStartE2EDuration="29.348301539s" podCreationTimestamp="2026-04-24 21:27:38 +0000 UTC" firstStartedPulling="2026-04-24 21:28:01.056254155 +0000 UTC m=+33.479761541" lastFinishedPulling="2026-04-24 21:28:07.056995538 +0000 UTC m=+39.480502937" observedRunningTime="2026-04-24 21:28:07.347891037 +0000 UTC m=+39.771398443" watchObservedRunningTime="2026-04-24 21:28:07.348301539 +0000 UTC m=+39.771808943" Apr 24 21:28:08.397084 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:08.397040 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls\") pod \"dns-default-9nj6p\" (UID: \"0c465288-2f3f-4fc1-9192-76111e546363\") " pod="openshift-dns/dns-default-9nj6p" Apr 24 21:28:08.397544 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:08.397138 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:08.397544 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:08.397172 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xm65x\" (UID: \"755a83ac-6e0b-4533-8c76-435876e1c64e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xm65x" Apr 24 21:28:08.397544 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:08.397211 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:28:08.397544 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:08.397272 2570 
secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:28:08.397544 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:08.397277 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:28:08.397544 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:08.397292 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls podName:0c465288-2f3f-4fc1-9192-76111e546363 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:16.397268506 +0000 UTC m=+48.820775905 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls") pod "dns-default-9nj6p" (UID: "0c465288-2f3f-4fc1-9192-76111e546363") : secret "dns-default-metrics-tls" not found Apr 24 21:28:08.397544 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:08.397293 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76585dbdfb-6n5xh: secret "image-registry-tls" not found Apr 24 21:28:08.397544 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:08.397321 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert podName:755a83ac-6e0b-4533-8c76-435876e1c64e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:16.397305201 +0000 UTC m=+48.820812589 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-xm65x" (UID: "755a83ac-6e0b-4533-8c76-435876e1c64e") : secret "networking-console-plugin-cert" not found Apr 24 21:28:08.397544 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:08.397343 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls podName:deb27629-83f5-41b3-b267-7644e8af713e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:16.397331759 +0000 UTC m=+48.820839144 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls") pod "image-registry-76585dbdfb-6n5xh" (UID: "deb27629-83f5-41b3-b267-7644e8af713e") : secret "image-registry-tls" not found Apr 24 21:28:08.498365 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:08.498334 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert\") pod \"ingress-canary-p4k7x\" (UID: \"4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5\") " pod="openshift-ingress-canary/ingress-canary-p4k7x" Apr 24 21:28:08.498510 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:08.498472 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:28:08.498565 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:08.498536 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert podName:4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:28:16.498521529 +0000 UTC m=+48.922028914 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert") pod "ingress-canary-p4k7x" (UID: "4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5") : secret "canary-serving-cert" not found Apr 24 21:28:10.274793 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:10.274748 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" event={"ID":"5e907be1-1dd2-42ef-82cb-1889550f56df","Type":"ContainerStarted","Data":"03406f49ea7689b96544e05c65b605281636bfbbadb59fdd95a65a33d20fc0a0"} Apr 24 21:28:10.274793 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:10.274786 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" event={"ID":"5e907be1-1dd2-42ef-82cb-1889550f56df","Type":"ContainerStarted","Data":"a6749405715cede22ecf40db5ed01408ae097561edd75e4ade2be6f0df75bdfd"} Apr 24 21:28:10.294632 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:10.294575 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" podStartSLOduration=23.013978435 podStartE2EDuration="31.294560009s" podCreationTimestamp="2026-04-24 21:27:39 +0000 UTC" firstStartedPulling="2026-04-24 21:28:01.06236254 +0000 UTC m=+33.485869925" lastFinishedPulling="2026-04-24 21:28:09.342944111 +0000 UTC m=+41.766451499" observedRunningTime="2026-04-24 21:28:10.293602908 +0000 UTC m=+42.717110315" watchObservedRunningTime="2026-04-24 21:28:10.294560009 +0000 UTC m=+42.718067417" Apr 24 21:28:10.615961 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:10.615889 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bb780578-f495-4afb-a817-59c177c8993a-original-pull-secret\") pod \"global-pull-secret-syncer-2sfj6\" (UID: \"bb780578-f495-4afb-a817-59c177c8993a\") " pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:28:10.619514 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:10.619484 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bb780578-f495-4afb-a817-59c177c8993a-original-pull-secret\") pod \"global-pull-secret-syncer-2sfj6\" (UID: \"bb780578-f495-4afb-a817-59c177c8993a\") " pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:28:10.806348 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:10.806308 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-2sfj6" Apr 24 21:28:10.917101 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:10.917065 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-2sfj6"] Apr 24 21:28:10.920558 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:28:10.920530 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb780578_f495_4afb_a817_59c177c8993a.slice/crio-c93ec2c0f5aba9782c486fe84d2afb09639dec524359988709d648edc64d8a79 WatchSource:0}: Error finding container c93ec2c0f5aba9782c486fe84d2afb09639dec524359988709d648edc64d8a79: Status 404 returned error can't find the container with id c93ec2c0f5aba9782c486fe84d2afb09639dec524359988709d648edc64d8a79 Apr 24 21:28:11.277310 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:11.277275 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2sfj6" event={"ID":"bb780578-f495-4afb-a817-59c177c8993a","Type":"ContainerStarted","Data":"c93ec2c0f5aba9782c486fe84d2afb09639dec524359988709d648edc64d8a79"} Apr 24 21:28:15.289548 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:15.289514 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-2sfj6" event={"ID":"bb780578-f495-4afb-a817-59c177c8993a","Type":"ContainerStarted","Data":"b996ac5d90615f2d64a5f25757d4252b9e4d00039ad0feb67d22bbe827ebb8b1"} Apr 24 21:28:15.314415 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:15.314376 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-2sfj6" podStartSLOduration=33.900404911 podStartE2EDuration="37.314363297s" podCreationTimestamp="2026-04-24 21:27:38 +0000 UTC" firstStartedPulling="2026-04-24 21:28:10.922402575 +0000 UTC m=+43.345909967" lastFinishedPulling="2026-04-24 21:28:14.336360965 +0000 UTC m=+46.759868353" observedRunningTime="2026-04-24 21:28:15.313741804 +0000 UTC m=+47.737249211" watchObservedRunningTime="2026-04-24 21:28:15.314363297 +0000 UTC m=+47.737870740" Apr 24 21:28:16.461959 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:16.461924 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xm65x\" (UID: \"755a83ac-6e0b-4533-8c76-435876e1c64e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xm65x" Apr 24 21:28:16.462399 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:16.462008 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls\") pod \"dns-default-9nj6p\" (UID: \"0c465288-2f3f-4fc1-9192-76111e546363\") " pod="openshift-dns/dns-default-9nj6p" Apr 24 21:28:16.462399 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:16.462069 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:28:16.462399 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:16.462093 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: 
\"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:16.462399 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:16.462126 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert podName:755a83ac-6e0b-4533-8c76-435876e1c64e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:32.462110856 +0000 UTC m=+64.885618241 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-xm65x" (UID: "755a83ac-6e0b-4533-8c76-435876e1c64e") : secret "networking-console-plugin-cert" not found Apr 24 21:28:16.462399 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:16.462146 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:28:16.462399 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:16.462197 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:28:16.462399 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:16.462211 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76585dbdfb-6n5xh: secret "image-registry-tls" not found Apr 24 21:28:16.462399 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:16.462203 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls podName:0c465288-2f3f-4fc1-9192-76111e546363 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:32.46218786 +0000 UTC m=+64.885695260 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls") pod "dns-default-9nj6p" (UID: "0c465288-2f3f-4fc1-9192-76111e546363") : secret "dns-default-metrics-tls" not found Apr 24 21:28:16.462399 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:16.462264 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls podName:deb27629-83f5-41b3-b267-7644e8af713e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:32.462250069 +0000 UTC m=+64.885757462 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls") pod "image-registry-76585dbdfb-6n5xh" (UID: "deb27629-83f5-41b3-b267-7644e8af713e") : secret "image-registry-tls" not found Apr 24 21:28:16.562609 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:16.562585 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert\") pod \"ingress-canary-p4k7x\" (UID: \"4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5\") " pod="openshift-ingress-canary/ingress-canary-p4k7x" Apr 24 21:28:16.562715 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:16.562704 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:28:16.562754 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:16.562748 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert podName:4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:32.562737805 +0000 UTC m=+64.986245190 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert") pod "ingress-canary-p4k7x" (UID: "4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5") : secret "canary-serving-cert" not found Apr 24 21:28:28.233760 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:28.233729 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6c6jh" Apr 24 21:28:32.480180 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:32.480143 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:28:32.480567 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:32.480184 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xm65x\" (UID: \"755a83ac-6e0b-4533-8c76-435876e1c64e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xm65x" Apr 24 21:28:32.480567 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:32.480235 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls\") pod \"dns-default-9nj6p\" (UID: \"0c465288-2f3f-4fc1-9192-76111e546363\") " pod="openshift-dns/dns-default-9nj6p" Apr 24 21:28:32.480567 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:32.480286 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:28:32.480567 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:32.480305 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76585dbdfb-6n5xh: secret "image-registry-tls" not found Apr 24 21:28:32.480567 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:32.480311 2570 secret.go:189] 
Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:28:32.480567 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:32.480345 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:28:32.480567 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:32.480368 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls podName:deb27629-83f5-41b3-b267-7644e8af713e nodeName:}" failed. No retries permitted until 2026-04-24 21:29:04.480352104 +0000 UTC m=+96.903859489 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls") pod "image-registry-76585dbdfb-6n5xh" (UID: "deb27629-83f5-41b3-b267-7644e8af713e") : secret "image-registry-tls" not found Apr 24 21:28:32.480567 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:32.480382 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls podName:0c465288-2f3f-4fc1-9192-76111e546363 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:04.480375615 +0000 UTC m=+96.903883001 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls") pod "dns-default-9nj6p" (UID: "0c465288-2f3f-4fc1-9192-76111e546363") : secret "dns-default-metrics-tls" not found Apr 24 21:28:32.480567 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:32.480400 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert podName:755a83ac-6e0b-4533-8c76-435876e1c64e nodeName:}" failed. No retries permitted until 2026-04-24 21:29:04.480386842 +0000 UTC m=+96.903894227 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-xm65x" (UID: "755a83ac-6e0b-4533-8c76-435876e1c64e") : secret "networking-console-plugin-cert" not found Apr 24 21:28:32.581249 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:32.581208 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert\") pod \"ingress-canary-p4k7x\" (UID: \"4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5\") " pod="openshift-ingress-canary/ingress-canary-p4k7x" Apr 24 21:28:32.581411 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:32.581346 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:28:32.581411 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:32.581404 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert podName:4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:04.581390154 +0000 UTC m=+97.004897539 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert") pod "ingress-canary-p4k7x" (UID: "4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5") : secret "canary-serving-cert" not found Apr 24 21:28:33.792727 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:33.792671 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpxms\" (UniqueName: \"kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms\") pod \"network-check-target-bjr9v\" (UID: \"1a1d36cf-baab-4f24-a9d1-4dde21da6db3\") " pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:28:33.792727 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:33.792729 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs\") pod \"network-metrics-daemon-892qf\" (UID: \"87439f5a-542b-48ed-980f-a2183de13b6f\") " pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:28:33.795413 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:33.795391 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:28:33.795636 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:33.795619 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:28:33.803580 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:33.803562 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 21:28:33.803764 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:28:33.803615 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs podName:87439f5a-542b-48ed-980f-a2183de13b6f nodeName:}" failed. No retries permitted until 2026-04-24 21:29:37.803599142 +0000 UTC m=+130.227106528 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs") pod "network-metrics-daemon-892qf" (UID: "87439f5a-542b-48ed-980f-a2183de13b6f") : secret "metrics-daemon-secret" not found Apr 24 21:28:33.806783 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:33.805673 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:28:33.816200 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:33.816179 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpxms\" (UniqueName: \"kubernetes.io/projected/1a1d36cf-baab-4f24-a9d1-4dde21da6db3-kube-api-access-wpxms\") pod \"network-check-target-bjr9v\" (UID: \"1a1d36cf-baab-4f24-a9d1-4dde21da6db3\") " pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:28:33.897352 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:33.897319 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sctkp\"" Apr 24 21:28:33.905751 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:33.905726 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:28:34.021989 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:34.021958 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bjr9v"] Apr 24 21:28:34.024714 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:28:34.024681 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a1d36cf_baab_4f24_a9d1_4dde21da6db3.slice/crio-7409f49db03e3d02c4021713715d6a4448838c9446ce3fecc0bf16f127f2ba92 WatchSource:0}: Error finding container 7409f49db03e3d02c4021713715d6a4448838c9446ce3fecc0bf16f127f2ba92: Status 404 returned error can't find the container with id 7409f49db03e3d02c4021713715d6a4448838c9446ce3fecc0bf16f127f2ba92 Apr 24 21:28:34.336976 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:34.336934 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bjr9v" event={"ID":"1a1d36cf-baab-4f24-a9d1-4dde21da6db3","Type":"ContainerStarted","Data":"7409f49db03e3d02c4021713715d6a4448838c9446ce3fecc0bf16f127f2ba92"} Apr 24 21:28:37.347272 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:37.347188 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bjr9v" event={"ID":"1a1d36cf-baab-4f24-a9d1-4dde21da6db3","Type":"ContainerStarted","Data":"5c23b85f0e3ccd3d3eacabae27ec009009a28be814a42f7113ee0817eb76989b"} Apr 24 21:28:37.347623 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:28:37.347303 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:29:04.521593 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:29:04.521559 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xm65x\" (UID: \"755a83ac-6e0b-4533-8c76-435876e1c64e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xm65x" Apr 24 21:29:04.522124 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:29:04.521612 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls\") pod \"dns-default-9nj6p\" (UID: \"0c465288-2f3f-4fc1-9192-76111e546363\") " pod="openshift-dns/dns-default-9nj6p" Apr 24 21:29:04.522124 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:29:04.521688 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:29:04.522124 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:29:04.521699 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:29:04.522124 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:29:04.521707 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:29:04.522124 ip-10-0-129-230 kubenswrapper[2570]: 
E0424 21:29:04.521752 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls podName:0c465288-2f3f-4fc1-9192-76111e546363 nodeName:}" failed. No retries permitted until 2026-04-24 21:30:08.52173907 +0000 UTC m=+160.945246455 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls") pod "dns-default-9nj6p" (UID: "0c465288-2f3f-4fc1-9192-76111e546363") : secret "dns-default-metrics-tls" not found Apr 24 21:29:04.522124 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:29:04.521767 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:29:04.522124 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:29:04.521779 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76585dbdfb-6n5xh: secret "image-registry-tls" not found Apr 24 21:29:04.522124 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:29:04.521781 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert podName:755a83ac-6e0b-4533-8c76-435876e1c64e nodeName:}" failed. No retries permitted until 2026-04-24 21:30:08.52176262 +0000 UTC m=+160.945270020 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-xm65x" (UID: "755a83ac-6e0b-4533-8c76-435876e1c64e") : secret "networking-console-plugin-cert" not found Apr 24 21:29:04.522124 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:29:04.521814 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls podName:deb27629-83f5-41b3-b267-7644e8af713e nodeName:}" failed. No retries permitted until 2026-04-24 21:30:08.521805303 +0000 UTC m=+160.945312691 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls") pod "image-registry-76585dbdfb-6n5xh" (UID: "deb27629-83f5-41b3-b267-7644e8af713e") : secret "image-registry-tls" not found Apr 24 21:29:04.622333 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:29:04.622312 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert\") pod \"ingress-canary-p4k7x\" (UID: \"4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5\") " pod="openshift-ingress-canary/ingress-canary-p4k7x" Apr 24 21:29:04.622445 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:29:04.622387 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:29:04.622445 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:29:04.622436 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert podName:4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5 nodeName:}" failed. No retries permitted until 2026-04-24 21:30:08.622427102 +0000 UTC m=+161.045934486 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert") pod "ingress-canary-p4k7x" (UID: "4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5") : secret "canary-serving-cert" not found Apr 24 21:29:08.351873 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:29:08.351842 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bjr9v" Apr 24 21:29:08.369521 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:29:08.369476 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bjr9v" podStartSLOduration=97.315861291 podStartE2EDuration="1m40.369462006s" podCreationTimestamp="2026-04-24 21:27:28 +0000 UTC" firstStartedPulling="2026-04-24 21:28:34.02648827 +0000 UTC m=+66.449995656" lastFinishedPulling="2026-04-24 21:28:37.080088967 +0000 UTC m=+69.503596371" observedRunningTime="2026-04-24 21:28:37.390641337 +0000 UTC m=+69.814148744" watchObservedRunningTime="2026-04-24 21:29:08.369462006 +0000 UTC m=+100.792969412" Apr 24 21:29:37.847410 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:29:37.847377 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs\") pod \"network-metrics-daemon-892qf\" (UID: \"87439f5a-542b-48ed-980f-a2183de13b6f\") " pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:29:37.847856 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:29:37.847515 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 21:29:37.847856 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:29:37.847597 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs podName:87439f5a-542b-48ed-980f-a2183de13b6f nodeName:}" failed. No retries permitted until 2026-04-24 21:31:39.847581302 +0000 UTC m=+252.271088687 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs") pod "network-metrics-daemon-892qf" (UID: "87439f5a-542b-48ed-980f-a2183de13b6f") : secret "metrics-daemon-secret" not found Apr 24 21:29:41.343730 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:29:41.343702 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6lvzd_e90ff02b-a473-4732-a712-a6377d84bf43/dns-node-resolver/0.log" Apr 24 21:29:41.945475 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:29:41.945450 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-dxhb2_2331d294-90e6-4527-bfaa-8f3913c788e1/node-ca/0.log" Apr 24 21:30:03.573572 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:30:03.573537 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-xm65x" podUID="755a83ac-6e0b-4533-8c76-435876e1c64e" Apr 24 21:30:03.581804 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:30:03.581783 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" podUID="deb27629-83f5-41b3-b267-7644e8af713e" Apr 24 21:30:03.627836 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:30:03.627811 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-9nj6p" podUID="0c465288-2f3f-4fc1-9192-76111e546363" Apr 24 21:30:03.648987 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:30:03.648960 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-p4k7x" podUID="4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5" Apr 24 21:30:04.548667 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:04.548640 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9nj6p" Apr 24 21:30:04.548826 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:04.548686 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:30:04.548826 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:04.548704 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-xm65x" Apr 24 21:30:05.084165 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:30:05.084133 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-892qf" podUID="87439f5a-542b-48ed-980f-a2183de13b6f" Apr 24 21:30:07.265814 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:07.265768 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s" podUID="b75151dd-6829-4490-8047-359e80fcc2f9" containerName="acm-agent" probeResult="failure" output="Get \"http://10.133.0.6:8000/readyz\": dial tcp 10.133.0.6:8000: connect: connection refused" Apr 24 21:30:07.557125 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:07.557057 2570 generic.go:358] "Generic (PLEG): container finished" podID="005374f7-a129-4b9a-aa9d-391fff615391" containerID="db43f5fe95a1feba31fac1d37713a391a8163c730ee6363a9a8868ad37013bc4" exitCode=255 Apr 24 21:30:07.557248 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:07.557116 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d967f567f-7xx46" event={"ID":"005374f7-a129-4b9a-aa9d-391fff615391","Type":"ContainerDied","Data":"db43f5fe95a1feba31fac1d37713a391a8163c730ee6363a9a8868ad37013bc4"} Apr 24 21:30:07.557442 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:07.557425 2570 scope.go:117] "RemoveContainer" containerID="db43f5fe95a1feba31fac1d37713a391a8163c730ee6363a9a8868ad37013bc4" Apr 24 21:30:07.558576 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:07.558557 2570 generic.go:358] "Generic (PLEG): container finished" podID="b75151dd-6829-4490-8047-359e80fcc2f9" containerID="90b8a08a8aa9e58cab22b9871ff49cee28921edb8b93cbe61c2d62b561df8565" exitCode=1 Apr 24 21:30:07.558667 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:07.558595 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s" event={"ID":"b75151dd-6829-4490-8047-359e80fcc2f9","Type":"ContainerDied","Data":"90b8a08a8aa9e58cab22b9871ff49cee28921edb8b93cbe61c2d62b561df8565"} Apr 24 21:30:07.558895 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:07.558880 2570 scope.go:117] "RemoveContainer" containerID="90b8a08a8aa9e58cab22b9871ff49cee28921edb8b93cbe61c2d62b561df8565" Apr 24 21:30:08.561505 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:08.561466 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls\") pod \"dns-default-9nj6p\" (UID: \"0c465288-2f3f-4fc1-9192-76111e546363\") " pod="openshift-dns/dns-default-9nj6p" Apr 24 21:30:08.561876 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:08.561565 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:30:08.561876 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:08.561608 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xm65x\" (UID: \"755a83ac-6e0b-4533-8c76-435876e1c64e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xm65x" Apr 24 21:30:08.564229 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:08.564202 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0c465288-2f3f-4fc1-9192-76111e546363-metrics-tls\") pod \"dns-default-9nj6p\" (UID: \"0c465288-2f3f-4fc1-9192-76111e546363\") " pod="openshift-dns/dns-default-9nj6p" Apr 24 21:30:08.564229 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:08.564213 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d967f567f-7xx46" event={"ID":"005374f7-a129-4b9a-aa9d-391fff615391","Type":"ContainerStarted","Data":"2fdcac03845db2b46cf9fa0ea7e3650ad85ed03ae76e46e030d2abd11c4cee60"} Apr 24 21:30:08.564418 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:08.564258 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/755a83ac-6e0b-4533-8c76-435876e1c64e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xm65x\" (UID: \"755a83ac-6e0b-4533-8c76-435876e1c64e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xm65x" Apr 24 21:30:08.564418 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:08.564369 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls\") pod \"image-registry-76585dbdfb-6n5xh\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:30:08.565810 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:08.565787 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s" event={"ID":"b75151dd-6829-4490-8047-359e80fcc2f9","Type":"ContainerStarted","Data":"becbc19a6821a93d79b7cd0122671366821c5bdd1c3c3bd195812dc6f1d7e096"} Apr 24 21:30:08.566051 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:08.566016 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s" Apr 24 21:30:08.566574 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:08.566557 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5d4fb7d7d5-czg9s" Apr 24 21:30:08.661968 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:08.661937 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert\") pod \"ingress-canary-p4k7x\" (UID: \"4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5\") " pod="openshift-ingress-canary/ingress-canary-p4k7x" Apr 24 21:30:08.664064 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:08.664045 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5-cert\") pod \"ingress-canary-p4k7x\" (UID: \"4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5\") " 
pod="openshift-ingress-canary/ingress-canary-p4k7x" Apr 24 21:30:08.752476 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:08.752451 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5mvz7\"" Apr 24 21:30:08.752658 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:08.752451 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6rrzj\"" Apr 24 21:30:08.752718 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:08.752705 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-bh4pt\"" Apr 24 21:30:08.760556 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:08.760538 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-xm65x" Apr 24 21:30:08.760613 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:08.760569 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:30:08.760672 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:08.760658 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9nj6p" Apr 24 21:30:08.906557 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:08.906519 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-76585dbdfb-6n5xh"] Apr 24 21:30:08.909154 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:30:08.909114 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeb27629_83f5_41b3_b267_7644e8af713e.slice/crio-7d7e1918252512a64fb4974b137efa75a566748648661daf40bc1d56be2b5d82 WatchSource:0}: Error finding container 7d7e1918252512a64fb4974b137efa75a566748648661daf40bc1d56be2b5d82: Status 404 returned error can't find the container with id 7d7e1918252512a64fb4974b137efa75a566748648661daf40bc1d56be2b5d82 Apr 24 21:30:08.917709 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:08.917684 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-xm65x"] Apr 24 21:30:08.937351 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:08.937321 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9nj6p"] Apr 24 21:30:08.940384 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:30:08.940356 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c465288_2f3f_4fc1_9192_76111e546363.slice/crio-57495a519692f90d3973d547cef617f42a3df63bc9fa34eed5573cf1120bde28 WatchSource:0}: Error finding container 57495a519692f90d3973d547cef617f42a3df63bc9fa34eed5573cf1120bde28: Status 404 returned error can't find the container with id 57495a519692f90d3973d547cef617f42a3df63bc9fa34eed5573cf1120bde28 Apr 24 21:30:09.570849 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:09.570796 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9nj6p" event={"ID":"0c465288-2f3f-4fc1-9192-76111e546363","Type":"ContainerStarted","Data":"57495a519692f90d3973d547cef617f42a3df63bc9fa34eed5573cf1120bde28"} Apr 24 21:30:09.572826 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:09.572795 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-xm65x" event={"ID":"755a83ac-6e0b-4533-8c76-435876e1c64e","Type":"ContainerStarted","Data":"922e87ed660d096f97f62ed6ac71f5b0dfaa08c4b39ff4fe3d8cf00e6aef7d16"} Apr 24 21:30:09.574815 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:09.574335 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" event={"ID":"deb27629-83f5-41b3-b267-7644e8af713e","Type":"ContainerStarted","Data":"362e097eeb8fde6f58e798624089deb80fee9931221e7acdc9fb587675e2fef9"} Apr 24 21:30:09.574815 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:09.574369 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" event={"ID":"deb27629-83f5-41b3-b267-7644e8af713e","Type":"ContainerStarted","Data":"7d7e1918252512a64fb4974b137efa75a566748648661daf40bc1d56be2b5d82"} Apr 24 21:30:09.574815 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:09.574484 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:30:09.596378 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:09.596323 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" podStartSLOduration=162.596306084 podStartE2EDuration="2m42.596306084s" podCreationTimestamp="2026-04-24 21:27:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:30:09.595279275 +0000 UTC m=+162.018786683" watchObservedRunningTime="2026-04-24 21:30:09.596306084 +0000 UTC m=+162.019813492" Apr 24 21:30:10.577952 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:10.577923 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9nj6p" event={"ID":"0c465288-2f3f-4fc1-9192-76111e546363","Type":"ContainerStarted","Data":"70a0fe69dd7df92f1607e07416466942cb132762d5267fdfab580e12b1eed069"} Apr 24 21:30:10.579082 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:10.579058 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-xm65x" event={"ID":"755a83ac-6e0b-4533-8c76-435876e1c64e","Type":"ContainerStarted","Data":"80170e8edf6717c55b6229927d8ce00e3209d9c1c6d60717c67e9f52edca0ac0"} Apr 24 21:30:10.596560 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:10.596520 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-xm65x" podStartSLOduration=158.14670598 podStartE2EDuration="2m39.596508214s" podCreationTimestamp="2026-04-24 21:27:31 +0000 UTC" firstStartedPulling="2026-04-24 21:30:08.926112051 +0000 UTC m=+161.349619450" lastFinishedPulling="2026-04-24 21:30:10.375914286 +0000 UTC m=+162.799421684" observedRunningTime="2026-04-24 21:30:10.595818833 +0000 UTC m=+163.019326241" watchObservedRunningTime="2026-04-24 21:30:10.596508214 +0000 UTC m=+163.020015621" Apr 24 21:30:11.382075 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.381999 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-lrbvd"] Apr 24 21:30:11.384960 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.384941 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-lrbvd" Apr 24 21:30:11.387365 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.387344 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 21:30:11.388236 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.388220 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 21:30:11.388349 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.388276 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 21:30:11.388349 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.388302 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4xl79\"" Apr 24 21:30:11.388349 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.388328 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 21:30:11.395381 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.395361 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-lrbvd"] Apr 24 21:30:11.481878 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.481851 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7e457ae3-8c68-4737-9278-09ce86be5d5d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lrbvd\" (UID: \"7e457ae3-8c68-4737-9278-09ce86be5d5d\") " pod="openshift-insights/insights-runtime-extractor-lrbvd" Apr 24 21:30:11.481983 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.481884 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7e457ae3-8c68-4737-9278-09ce86be5d5d-data-volume\") pod \"insights-runtime-extractor-lrbvd\" (UID: \"7e457ae3-8c68-4737-9278-09ce86be5d5d\") " pod="openshift-insights/insights-runtime-extractor-lrbvd" Apr 24 21:30:11.481983 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.481908 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7e457ae3-8c68-4737-9278-09ce86be5d5d-crio-socket\") pod \"insights-runtime-extractor-lrbvd\" (UID: \"7e457ae3-8c68-4737-9278-09ce86be5d5d\") " pod="openshift-insights/insights-runtime-extractor-lrbvd" Apr 24 21:30:11.481983 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.481972 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s2mq\" (UniqueName: \"kubernetes.io/projected/7e457ae3-8c68-4737-9278-09ce86be5d5d-kube-api-access-9s2mq\") pod \"insights-runtime-extractor-lrbvd\" (UID: \"7e457ae3-8c68-4737-9278-09ce86be5d5d\") " pod="openshift-insights/insights-runtime-extractor-lrbvd" Apr 24 21:30:11.482133 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.482010 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7e457ae3-8c68-4737-9278-09ce86be5d5d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lrbvd\" (UID: 
\"7e457ae3-8c68-4737-9278-09ce86be5d5d\") " pod="openshift-insights/insights-runtime-extractor-lrbvd" Apr 24 21:30:11.582344 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.582317 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7e457ae3-8c68-4737-9278-09ce86be5d5d-crio-socket\") pod \"insights-runtime-extractor-lrbvd\" (UID: \"7e457ae3-8c68-4737-9278-09ce86be5d5d\") " pod="openshift-insights/insights-runtime-extractor-lrbvd" Apr 24 21:30:11.582730 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.582371 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9s2mq\" (UniqueName: \"kubernetes.io/projected/7e457ae3-8c68-4737-9278-09ce86be5d5d-kube-api-access-9s2mq\") pod \"insights-runtime-extractor-lrbvd\" (UID: \"7e457ae3-8c68-4737-9278-09ce86be5d5d\") " pod="openshift-insights/insights-runtime-extractor-lrbvd" Apr 24 21:30:11.582730 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.582413 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7e457ae3-8c68-4737-9278-09ce86be5d5d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lrbvd\" (UID: \"7e457ae3-8c68-4737-9278-09ce86be5d5d\") " pod="openshift-insights/insights-runtime-extractor-lrbvd" Apr 24 21:30:11.582730 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.582426 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7e457ae3-8c68-4737-9278-09ce86be5d5d-crio-socket\") pod \"insights-runtime-extractor-lrbvd\" (UID: \"7e457ae3-8c68-4737-9278-09ce86be5d5d\") " pod="openshift-insights/insights-runtime-extractor-lrbvd" Apr 24 21:30:11.582730 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.582485 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7e457ae3-8c68-4737-9278-09ce86be5d5d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lrbvd\" (UID: \"7e457ae3-8c68-4737-9278-09ce86be5d5d\") " pod="openshift-insights/insights-runtime-extractor-lrbvd" Apr 24 21:30:11.582730 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.582510 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7e457ae3-8c68-4737-9278-09ce86be5d5d-data-volume\") pod \"insights-runtime-extractor-lrbvd\" (UID: \"7e457ae3-8c68-4737-9278-09ce86be5d5d\") " pod="openshift-insights/insights-runtime-extractor-lrbvd" Apr 24 21:30:11.582946 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.582870 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7e457ae3-8c68-4737-9278-09ce86be5d5d-data-volume\") pod \"insights-runtime-extractor-lrbvd\" (UID: \"7e457ae3-8c68-4737-9278-09ce86be5d5d\") " pod="openshift-insights/insights-runtime-extractor-lrbvd" Apr 24 21:30:11.583143 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.583121 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7e457ae3-8c68-4737-9278-09ce86be5d5d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lrbvd\" (UID: \"7e457ae3-8c68-4737-9278-09ce86be5d5d\") " pod="openshift-insights/insights-runtime-extractor-lrbvd" Apr 24 21:30:11.583599 ip-10-0-129-230 
kubenswrapper[2570]: I0424 21:30:11.583576 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9nj6p" event={"ID":"0c465288-2f3f-4fc1-9192-76111e546363","Type":"ContainerStarted","Data":"48e54ac05266136af9bbfd9b8790f962ffe9dccf32d6b26283ecb1e8195e3b91"} Apr 24 21:30:11.584907 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.584888 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7e457ae3-8c68-4737-9278-09ce86be5d5d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lrbvd\" (UID: \"7e457ae3-8c68-4737-9278-09ce86be5d5d\") " pod="openshift-insights/insights-runtime-extractor-lrbvd" Apr 24 21:30:11.592553 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.592532 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s2mq\" (UniqueName: \"kubernetes.io/projected/7e457ae3-8c68-4737-9278-09ce86be5d5d-kube-api-access-9s2mq\") pod \"insights-runtime-extractor-lrbvd\" (UID: \"7e457ae3-8c68-4737-9278-09ce86be5d5d\") " pod="openshift-insights/insights-runtime-extractor-lrbvd" Apr 24 21:30:11.604319 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.604281 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9nj6p" podStartSLOduration=130.166679554 podStartE2EDuration="2m11.604270048s" podCreationTimestamp="2026-04-24 21:28:00 +0000 UTC" firstStartedPulling="2026-04-24 21:30:08.942427522 +0000 UTC m=+161.365934908" lastFinishedPulling="2026-04-24 21:30:10.380018012 +0000 UTC m=+162.803525402" observedRunningTime="2026-04-24 21:30:11.603326042 +0000 UTC m=+164.026833485" watchObservedRunningTime="2026-04-24 21:30:11.604270048 +0000 UTC m=+164.027777454" Apr 24 21:30:11.693610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.693586 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-lrbvd" Apr 24 21:30:11.813704 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:11.813678 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-lrbvd"] Apr 24 21:30:11.816663 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:30:11.816636 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e457ae3_8c68_4737_9278_09ce86be5d5d.slice/crio-a06c289996baddb3aa1d4507044c28f72cacac4a1ea0464af462a123cb0d3d60 WatchSource:0}: Error finding container a06c289996baddb3aa1d4507044c28f72cacac4a1ea0464af462a123cb0d3d60: Status 404 returned error can't find the container with id a06c289996baddb3aa1d4507044c28f72cacac4a1ea0464af462a123cb0d3d60 Apr 24 21:30:12.587092 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:12.587060 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lrbvd" event={"ID":"7e457ae3-8c68-4737-9278-09ce86be5d5d","Type":"ContainerStarted","Data":"7ab82163dbf9cce43196ef5ad9889737d15c77af67e0655caab67fc0f5977a0e"} Apr 24 21:30:12.587380 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:12.587100 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lrbvd" event={"ID":"7e457ae3-8c68-4737-9278-09ce86be5d5d","Type":"ContainerStarted","Data":"ac50f02590d8365342631594f25f47ef8f5b8189de81ee5ffabd6091b55d9d73"} Apr 24 21:30:12.587380 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:12.587111 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lrbvd" event={"ID":"7e457ae3-8c68-4737-9278-09ce86be5d5d","Type":"ContainerStarted","Data":"a06c289996baddb3aa1d4507044c28f72cacac4a1ea0464af462a123cb0d3d60"} Apr 24 21:30:12.587380 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:12.587311 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-9nj6p" Apr 24 21:30:14.593319 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:14.593284 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lrbvd" event={"ID":"7e457ae3-8c68-4737-9278-09ce86be5d5d","Type":"ContainerStarted","Data":"4f8a43ff2b976f03927b58e5b9cee2c51b6c753bc81d0a08d3d92e3a9d4b5f92"} Apr 24 21:30:14.620089 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:14.620039 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-lrbvd" podStartSLOduration=1.639024235 podStartE2EDuration="3.620011106s" podCreationTimestamp="2026-04-24 21:30:11 +0000 UTC" firstStartedPulling="2026-04-24 21:30:11.872422852 +0000 UTC m=+164.295930238" lastFinishedPulling="2026-04-24 21:30:13.853409723 +0000 UTC m=+166.276917109" observedRunningTime="2026-04-24 21:30:14.618174598 +0000 UTC m=+167.041682004" watchObservedRunningTime="2026-04-24 21:30:14.620011106 +0000 UTC m=+167.043518513" Apr 24 21:30:16.063421 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:16.063389 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:30:18.064569 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.064535 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p4k7x" Apr 24 21:30:18.067290 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.067265 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-48pk8\"" Apr 24 21:30:18.075504 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.075478 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p4k7x" Apr 24 21:30:18.192984 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.192957 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p4k7x"] Apr 24 21:30:18.195862 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:30:18.195824 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4878b13f_f2f3_43a4_a7ed_f6b5b7c0acb5.slice/crio-1f728421cda60db4c5ec3888a5fde6ffc35a2a56a4e3e51cff1191601cbf413f WatchSource:0}: Error finding container 1f728421cda60db4c5ec3888a5fde6ffc35a2a56a4e3e51cff1191601cbf413f: Status 404 returned error can't find the container with id 1f728421cda60db4c5ec3888a5fde6ffc35a2a56a4e3e51cff1191601cbf413f Apr 24 21:30:18.605412 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.605374 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p4k7x" event={"ID":"4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5","Type":"ContainerStarted","Data":"1f728421cda60db4c5ec3888a5fde6ffc35a2a56a4e3e51cff1191601cbf413f"} Apr 24 21:30:18.753752 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.753713 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-5wq4b"] Apr 24 21:30:18.757260 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.757237 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.760286 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.760229 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-9mvpt\"" Apr 24 21:30:18.760388 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.760333 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 21:30:18.760452 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.760412 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 21:30:18.760978 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.760758 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 21:30:18.760978 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.760875 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 21:30:18.761158 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.761103 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 21:30:18.761492 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.761467 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 21:30:18.835347 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.835316 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-node-exporter-accelerators-collector-config\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.835497 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.835362 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vs8h\" (UniqueName: \"kubernetes.io/projected/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-kube-api-access-4vs8h\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.835497 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.835418 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-root\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.835497 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.835451 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.835497 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.835470 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-node-exporter-textfile\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.835674 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.835503 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-node-exporter-tls\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.835674 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.835560 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-sys\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.835674 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.835591 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-node-exporter-wtmp\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.835674 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.835611 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-metrics-client-ca\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.936420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.936396 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-node-exporter-accelerators-collector-config\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.936603 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.936428 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vs8h\" (UniqueName: \"kubernetes.io/projected/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-kube-api-access-4vs8h\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.936603 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.936463 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-root\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.936603 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.936501 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5wq4b\" (UID: 
\"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.936603 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.936530 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-node-exporter-textfile\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.936837 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.936610 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-root\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.936837 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.936655 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-node-exporter-tls\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.936837 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.936721 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-sys\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.936837 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.936759 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-node-exporter-wtmp\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.936837 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.936788 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-metrics-client-ca\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.936837 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.936799 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-sys\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.937174 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.936946 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-node-exporter-wtmp\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.937174 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.937107 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-node-exporter-accelerators-collector-config\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.937464 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.937424 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-node-exporter-textfile\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.937558 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.937530 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-metrics-client-ca\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.939629 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.939599 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-node-exporter-tls\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.939747 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.939646 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:18.944644 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:18.944619 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vs8h\" (UniqueName: \"kubernetes.io/projected/428c6e21-0255-4e74-bbeb-9e2fbbffbb1c-kube-api-access-4vs8h\") pod \"node-exporter-5wq4b\" (UID: \"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c\") " pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:19.068301 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:19.068271 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-5wq4b" Apr 24 21:30:19.078445 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:30:19.078420 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod428c6e21_0255_4e74_bbeb_9e2fbbffbb1c.slice/crio-1158df8841b211fef6c8744a8784c69ac68d8904b6cb17331c1a2cc5f81c04a1 WatchSource:0}: Error finding container 1158df8841b211fef6c8744a8784c69ac68d8904b6cb17331c1a2cc5f81c04a1: Status 404 returned error can't find the container with id 1158df8841b211fef6c8744a8784c69ac68d8904b6cb17331c1a2cc5f81c04a1 Apr 24 21:30:19.610667 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:19.610615 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5wq4b" event={"ID":"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c","Type":"ContainerStarted","Data":"1158df8841b211fef6c8744a8784c69ac68d8904b6cb17331c1a2cc5f81c04a1"} Apr 24 21:30:20.614406 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:20.614375 2570 generic.go:358] "Generic (PLEG): container finished" podID="428c6e21-0255-4e74-bbeb-9e2fbbffbb1c" containerID="98b8af4b52e2438126549759c25f61b2dbfb4fe6dd3f336b000b3a96e2e78454" exitCode=0 Apr 24 21:30:20.614815 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:20.614463 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5wq4b" event={"ID":"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c","Type":"ContainerDied","Data":"98b8af4b52e2438126549759c25f61b2dbfb4fe6dd3f336b000b3a96e2e78454"} Apr 24 21:30:20.615656 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:20.615636 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p4k7x" event={"ID":"4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5","Type":"ContainerStarted","Data":"b60ff239a203cd73d15f117e7cb50cd59898cf8f1aae30449d65d9c28b9bc742"} Apr 24 21:30:20.649128 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:20.649075 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-p4k7x" podStartSLOduration=139.159725973 podStartE2EDuration="2m20.649061505s" podCreationTimestamp="2026-04-24 21:28:00 +0000 UTC" firstStartedPulling="2026-04-24 21:30:18.197832414 +0000 UTC m=+170.621339819" lastFinishedPulling="2026-04-24 21:30:19.687167963 +0000 UTC m=+172.110675351" observedRunningTime="2026-04-24 21:30:20.647857934 +0000 UTC m=+173.071365342" watchObservedRunningTime="2026-04-24 21:30:20.649061505 +0000 UTC m=+173.072568911" Apr 24 21:30:21.620452 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:21.620414 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5wq4b" event={"ID":"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c","Type":"ContainerStarted","Data":"e54a7c2625318a7d6093126f9c3850d930496582ef20c4bc207b02b4e62f5baa"} Apr 24 21:30:21.620452 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:21.620455 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5wq4b" event={"ID":"428c6e21-0255-4e74-bbeb-9e2fbbffbb1c","Type":"ContainerStarted","Data":"c58f74699aa6b575b718870c3edbf8208d13653a272354345390949b87cb27e0"} Apr 24 21:30:21.643726 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:21.643683 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-5wq4b" podStartSLOduration=2.6640659270000002 podStartE2EDuration="3.643670971s" podCreationTimestamp="2026-04-24 
21:30:18 +0000 UTC" firstStartedPulling="2026-04-24 21:30:19.080306337 +0000 UTC m=+171.503813722" lastFinishedPulling="2026-04-24 21:30:20.059911382 +0000 UTC m=+172.483418766" observedRunningTime="2026-04-24 21:30:21.641820653 +0000 UTC m=+174.065328085" watchObservedRunningTime="2026-04-24 21:30:21.643670971 +0000 UTC m=+174.067178378" Apr 24 21:30:22.591799 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:22.591769 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9nj6p" Apr 24 21:30:28.764426 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:28.764392 2570 patch_prober.go:28] interesting pod/image-registry-76585dbdfb-6n5xh container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 21:30:28.764809 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:28.764442 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" podUID="deb27629-83f5-41b3-b267-7644e8af713e" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:30:30.583713 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:30.583686 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:30:33.078357 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:33.078322 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-76585dbdfb-6n5xh"] Apr 24 21:30:58.097213 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.097139 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" podUID="deb27629-83f5-41b3-b267-7644e8af713e" containerName="registry" containerID="cri-o://362e097eeb8fde6f58e798624089deb80fee9931221e7acdc9fb587675e2fef9" gracePeriod=30 Apr 24 21:30:58.353086 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.353066 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:30:58.427815 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.427777 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/deb27629-83f5-41b3-b267-7644e8af713e-image-registry-private-configuration\") pod \"deb27629-83f5-41b3-b267-7644e8af713e\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " Apr 24 21:30:58.427969 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.427821 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/deb27629-83f5-41b3-b267-7644e8af713e-registry-certificates\") pod \"deb27629-83f5-41b3-b267-7644e8af713e\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " Apr 24 21:30:58.427969 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.427855 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-bound-sa-token\") pod \"deb27629-83f5-41b3-b267-7644e8af713e\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " Apr 24 21:30:58.427969 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.427883 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/deb27629-83f5-41b3-b267-7644e8af713e-ca-trust-extracted\") pod \"deb27629-83f5-41b3-b267-7644e8af713e\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " Apr 24 21:30:58.427969 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.427918 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/deb27629-83f5-41b3-b267-7644e8af713e-trusted-ca\") pod \"deb27629-83f5-41b3-b267-7644e8af713e\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " Apr 24 21:30:58.427969 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.427942 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkd52\" (UniqueName: \"kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-kube-api-access-tkd52\") pod \"deb27629-83f5-41b3-b267-7644e8af713e\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " Apr 24 21:30:58.428288 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.428039 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls\") pod \"deb27629-83f5-41b3-b267-7644e8af713e\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " Apr 24 21:30:58.428288 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.428126 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/deb27629-83f5-41b3-b267-7644e8af713e-installation-pull-secrets\") pod \"deb27629-83f5-41b3-b267-7644e8af713e\" (UID: \"deb27629-83f5-41b3-b267-7644e8af713e\") " Apr 24 21:30:58.428464 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.428430 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deb27629-83f5-41b3-b267-7644e8af713e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "deb27629-83f5-41b3-b267-7644e8af713e" (UID: "deb27629-83f5-41b3-b267-7644e8af713e"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:58.428464 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.428449 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deb27629-83f5-41b3-b267-7644e8af713e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "deb27629-83f5-41b3-b267-7644e8af713e" (UID: "deb27629-83f5-41b3-b267-7644e8af713e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:30:58.430468 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.430443 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb27629-83f5-41b3-b267-7644e8af713e-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "deb27629-83f5-41b3-b267-7644e8af713e" (UID: "deb27629-83f5-41b3-b267-7644e8af713e"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:58.430770 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.430744 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "deb27629-83f5-41b3-b267-7644e8af713e" (UID: "deb27629-83f5-41b3-b267-7644e8af713e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:58.430867 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.430799 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "deb27629-83f5-41b3-b267-7644e8af713e" (UID: "deb27629-83f5-41b3-b267-7644e8af713e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:58.430934 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.430922 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-kube-api-access-tkd52" (OuterVolumeSpecName: "kube-api-access-tkd52") pod "deb27629-83f5-41b3-b267-7644e8af713e" (UID: "deb27629-83f5-41b3-b267-7644e8af713e"). InnerVolumeSpecName "kube-api-access-tkd52". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:30:58.430990 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.430930 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb27629-83f5-41b3-b267-7644e8af713e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "deb27629-83f5-41b3-b267-7644e8af713e" (UID: "deb27629-83f5-41b3-b267-7644e8af713e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:30:58.438309 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.438278 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deb27629-83f5-41b3-b267-7644e8af713e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "deb27629-83f5-41b3-b267-7644e8af713e" (UID: "deb27629-83f5-41b3-b267-7644e8af713e"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:30:58.529470 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.529436 2570 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/deb27629-83f5-41b3-b267-7644e8af713e-image-registry-private-configuration\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:30:58.529470 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.529463 2570 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/deb27629-83f5-41b3-b267-7644e8af713e-registry-certificates\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:30:58.529470 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.529474 2570 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-bound-sa-token\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:30:58.529685 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.529483 2570 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/deb27629-83f5-41b3-b267-7644e8af713e-ca-trust-extracted\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:30:58.529685 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.529493 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/deb27629-83f5-41b3-b267-7644e8af713e-trusted-ca\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:30:58.529685 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.529503 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tkd52\" (UniqueName: \"kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-kube-api-access-tkd52\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:30:58.529685 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.529512 2570 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/deb27629-83f5-41b3-b267-7644e8af713e-registry-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:30:58.529685 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.529521 2570 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/deb27629-83f5-41b3-b267-7644e8af713e-installation-pull-secrets\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:30:58.719442 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.719409 2570 generic.go:358] "Generic (PLEG): container finished" podID="deb27629-83f5-41b3-b267-7644e8af713e" containerID="362e097eeb8fde6f58e798624089deb80fee9931221e7acdc9fb587675e2fef9" exitCode=0 Apr 24 21:30:58.719620 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.719450 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" event={"ID":"deb27629-83f5-41b3-b267-7644e8af713e","Type":"ContainerDied","Data":"362e097eeb8fde6f58e798624089deb80fee9931221e7acdc9fb587675e2fef9"} Apr 24 21:30:58.719620 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.719474 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" 
event={"ID":"deb27629-83f5-41b3-b267-7644e8af713e","Type":"ContainerDied","Data":"7d7e1918252512a64fb4974b137efa75a566748648661daf40bc1d56be2b5d82"} Apr 24 21:30:58.719620 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.719475 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76585dbdfb-6n5xh" Apr 24 21:30:58.719620 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.719496 2570 scope.go:117] "RemoveContainer" containerID="362e097eeb8fde6f58e798624089deb80fee9931221e7acdc9fb587675e2fef9" Apr 24 21:30:58.727648 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.727629 2570 scope.go:117] "RemoveContainer" containerID="362e097eeb8fde6f58e798624089deb80fee9931221e7acdc9fb587675e2fef9" Apr 24 21:30:58.727921 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:30:58.727899 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"362e097eeb8fde6f58e798624089deb80fee9931221e7acdc9fb587675e2fef9\": container with ID starting with 362e097eeb8fde6f58e798624089deb80fee9931221e7acdc9fb587675e2fef9 not found: ID does not exist" containerID="362e097eeb8fde6f58e798624089deb80fee9931221e7acdc9fb587675e2fef9" Apr 24 21:30:58.727972 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.727930 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"362e097eeb8fde6f58e798624089deb80fee9931221e7acdc9fb587675e2fef9"} err="failed to get container status \"362e097eeb8fde6f58e798624089deb80fee9931221e7acdc9fb587675e2fef9\": rpc error: code = NotFound desc = could not find container \"362e097eeb8fde6f58e798624089deb80fee9931221e7acdc9fb587675e2fef9\": container with ID starting with 362e097eeb8fde6f58e798624089deb80fee9931221e7acdc9fb587675e2fef9 not found: ID does not exist" Apr 24 21:30:58.739767 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.739744 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-76585dbdfb-6n5xh"] Apr 24 21:30:58.743861 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:30:58.743838 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-76585dbdfb-6n5xh"] Apr 24 21:31:00.070376 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:31:00.070343 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deb27629-83f5-41b3-b267-7644e8af713e" path="/var/lib/kubelet/pods/deb27629-83f5-41b3-b267-7644e8af713e/volumes" Apr 24 21:31:10.908841 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:31:10.908800 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" podUID="5e907be1-1dd2-42ef-82cb-1889550f56df" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 21:31:20.909113 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:31:20.909072 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" podUID="5e907be1-1dd2-42ef-82cb-1889550f56df" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 21:31:30.908418 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:31:30.908384 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" podUID="5e907be1-1dd2-42ef-82cb-1889550f56df" 
containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 21:31:30.908898 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:31:30.908450 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" Apr 24 21:31:30.908898 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:31:30.908854 2570 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"03406f49ea7689b96544e05c65b605281636bfbbadb59fdd95a65a33d20fc0a0"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 24 21:31:30.908898 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:31:30.908888 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" podUID="5e907be1-1dd2-42ef-82cb-1889550f56df" containerName="service-proxy" containerID="cri-o://03406f49ea7689b96544e05c65b605281636bfbbadb59fdd95a65a33d20fc0a0" gracePeriod=30 Apr 24 21:31:31.807673 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:31:31.807639 2570 generic.go:358] "Generic (PLEG): container finished" podID="5e907be1-1dd2-42ef-82cb-1889550f56df" containerID="03406f49ea7689b96544e05c65b605281636bfbbadb59fdd95a65a33d20fc0a0" exitCode=2 Apr 24 21:31:31.807835 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:31:31.807706 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" event={"ID":"5e907be1-1dd2-42ef-82cb-1889550f56df","Type":"ContainerDied","Data":"03406f49ea7689b96544e05c65b605281636bfbbadb59fdd95a65a33d20fc0a0"} Apr 24 21:31:31.807835 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:31:31.807741 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59d5495f66-wbtqs" event={"ID":"5e907be1-1dd2-42ef-82cb-1889550f56df","Type":"ContainerStarted","Data":"8060c90be31a221342fa300287703edca8bd2002db551fa40681d821f45c71a1"} Apr 24 21:31:39.910274 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:31:39.910241 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs\") pod \"network-metrics-daemon-892qf\" (UID: \"87439f5a-542b-48ed-980f-a2183de13b6f\") " pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:31:39.912688 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:31:39.912670 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87439f5a-542b-48ed-980f-a2183de13b6f-metrics-certs\") pod \"network-metrics-daemon-892qf\" (UID: \"87439f5a-542b-48ed-980f-a2183de13b6f\") " pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:31:40.066955 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:31:40.066930 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mgr5d\"" Apr 24 21:31:40.074132 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:31:40.074117 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-892qf" Apr 24 21:31:40.190048 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:31:40.190002 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-892qf"] Apr 24 21:31:40.192851 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:31:40.192817 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87439f5a_542b_48ed_980f_a2183de13b6f.slice/crio-d98794f94e1ef6f4160f8237a2cbec6966b7475a7659641503a28716c64ca8f5 WatchSource:0}: Error finding container d98794f94e1ef6f4160f8237a2cbec6966b7475a7659641503a28716c64ca8f5: Status 404 returned error can't find the container with id d98794f94e1ef6f4160f8237a2cbec6966b7475a7659641503a28716c64ca8f5 Apr 24 21:31:40.833125 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:31:40.833050 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-892qf" event={"ID":"87439f5a-542b-48ed-980f-a2183de13b6f","Type":"ContainerStarted","Data":"d98794f94e1ef6f4160f8237a2cbec6966b7475a7659641503a28716c64ca8f5"} Apr 24 21:31:41.836673 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:31:41.836637 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-892qf" event={"ID":"87439f5a-542b-48ed-980f-a2183de13b6f","Type":"ContainerStarted","Data":"41aea801ec0830722c20b7a1f1187a7f6a273cb29307381ca4a8f0278512f92b"} Apr 24 21:31:41.836673 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:31:41.836673 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-892qf" event={"ID":"87439f5a-542b-48ed-980f-a2183de13b6f","Type":"ContainerStarted","Data":"04f4915f5d9675aab5ceb1f525ef7f6abfe92d9f103191d2ec046efe54281b4d"} Apr 24 21:31:41.856298 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:31:41.856244 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-892qf" podStartSLOduration=252.954152491 podStartE2EDuration="4m13.856230501s" podCreationTimestamp="2026-04-24 21:27:28 +0000 UTC" firstStartedPulling="2026-04-24 21:31:40.194560515 +0000 UTC m=+252.618067900" lastFinishedPulling="2026-04-24 21:31:41.096638523 +0000 UTC m=+253.520145910" observedRunningTime="2026-04-24 21:31:41.855185932 +0000 UTC m=+254.278693340" watchObservedRunningTime="2026-04-24 21:31:41.856230501 +0000 UTC m=+254.279737911" Apr 24 21:32:27.976331 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:32:27.976303 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 21:32:27.976825 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:32:27.976615 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 21:32:27.982403 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:32:27.982385 2570 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 21:34:16.869432 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:16.869399 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-v9bn7"] Apr 24 21:34:16.869822 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:16.869637 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="deb27629-83f5-41b3-b267-7644e8af713e" containerName="registry" Apr 24 
21:34:16.869822 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:16.869648 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb27629-83f5-41b3-b267-7644e8af713e" containerName="registry" Apr 24 21:34:16.869822 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:16.869698 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="deb27629-83f5-41b3-b267-7644e8af713e" containerName="registry" Apr 24 21:34:16.872306 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:16.872291 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-v9bn7" Apr 24 21:34:16.875878 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:16.875857 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 21:34:16.876010 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:16.875932 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-z6l4z\"" Apr 24 21:34:16.876164 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:16.876152 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:34:16.886514 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:16.886492 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:34:16.902834 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:16.902809 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-v9bn7"] Apr 24 21:34:17.015629 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:17.015604 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgvvn\" (UniqueName: \"kubernetes.io/projected/4f14a23f-acdc-436a-b4bc-b44e9ff88871-kube-api-access-hgvvn\") pod \"seaweedfs-86cc847c5c-v9bn7\" (UID: \"4f14a23f-acdc-436a-b4bc-b44e9ff88871\") " pod="kserve/seaweedfs-86cc847c5c-v9bn7" Apr 24 21:34:17.015780 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:17.015643 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4f14a23f-acdc-436a-b4bc-b44e9ff88871-data\") pod \"seaweedfs-86cc847c5c-v9bn7\" (UID: \"4f14a23f-acdc-436a-b4bc-b44e9ff88871\") " pod="kserve/seaweedfs-86cc847c5c-v9bn7" Apr 24 21:34:17.116147 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:17.116112 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4f14a23f-acdc-436a-b4bc-b44e9ff88871-data\") pod \"seaweedfs-86cc847c5c-v9bn7\" (UID: \"4f14a23f-acdc-436a-b4bc-b44e9ff88871\") " pod="kserve/seaweedfs-86cc847c5c-v9bn7" Apr 24 21:34:17.116300 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:17.116164 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hgvvn\" (UniqueName: \"kubernetes.io/projected/4f14a23f-acdc-436a-b4bc-b44e9ff88871-kube-api-access-hgvvn\") pod \"seaweedfs-86cc847c5c-v9bn7\" (UID: \"4f14a23f-acdc-436a-b4bc-b44e9ff88871\") " pod="kserve/seaweedfs-86cc847c5c-v9bn7" Apr 24 21:34:17.116538 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:17.116519 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4f14a23f-acdc-436a-b4bc-b44e9ff88871-data\") pod \"seaweedfs-86cc847c5c-v9bn7\" (UID: \"4f14a23f-acdc-436a-b4bc-b44e9ff88871\") " 
pod="kserve/seaweedfs-86cc847c5c-v9bn7" Apr 24 21:34:17.128426 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:17.128366 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgvvn\" (UniqueName: \"kubernetes.io/projected/4f14a23f-acdc-436a-b4bc-b44e9ff88871-kube-api-access-hgvvn\") pod \"seaweedfs-86cc847c5c-v9bn7\" (UID: \"4f14a23f-acdc-436a-b4bc-b44e9ff88871\") " pod="kserve/seaweedfs-86cc847c5c-v9bn7" Apr 24 21:34:17.181344 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:17.181309 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-v9bn7" Apr 24 21:34:17.317527 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:17.317504 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-v9bn7"] Apr 24 21:34:17.320251 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:34:17.320222 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f14a23f_acdc_436a_b4bc_b44e9ff88871.slice/crio-0dd82f72bc2daa2f29790c61eb235a393025e658c62726fcbb801a291a5ff388 WatchSource:0}: Error finding container 0dd82f72bc2daa2f29790c61eb235a393025e658c62726fcbb801a291a5ff388: Status 404 returned error can't find the container with id 0dd82f72bc2daa2f29790c61eb235a393025e658c62726fcbb801a291a5ff388 Apr 24 21:34:17.321848 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:17.321830 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:34:18.218668 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:18.218634 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-v9bn7" event={"ID":"4f14a23f-acdc-436a-b4bc-b44e9ff88871","Type":"ContainerStarted","Data":"0dd82f72bc2daa2f29790c61eb235a393025e658c62726fcbb801a291a5ff388"} Apr 24 21:34:20.225803 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:20.225771 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-v9bn7" event={"ID":"4f14a23f-acdc-436a-b4bc-b44e9ff88871","Type":"ContainerStarted","Data":"934f7d77a0457678003c0b6225f7cc230219259a69d45f5bdfdc11a71058dc69"} Apr 24 21:34:20.226237 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:20.225884 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-v9bn7" Apr 24 21:34:20.249686 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:20.249626 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-v9bn7" podStartSLOduration=1.752975027 podStartE2EDuration="4.249607304s" podCreationTimestamp="2026-04-24 21:34:16 +0000 UTC" firstStartedPulling="2026-04-24 21:34:17.321988317 +0000 UTC m=+409.745495703" lastFinishedPulling="2026-04-24 21:34:19.818620595 +0000 UTC m=+412.242127980" observedRunningTime="2026-04-24 21:34:20.246752117 +0000 UTC m=+412.670259524" watchObservedRunningTime="2026-04-24 21:34:20.249607304 +0000 UTC m=+412.673114764" Apr 24 21:34:26.231402 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:34:26.231372 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-v9bn7" Apr 24 21:35:27.844540 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:27.844513 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-r4gw9"] Apr 24 21:35:27.847671 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:27.847654 2570 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-r4gw9" Apr 24 21:35:27.850481 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:27.850458 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 24 21:35:27.850598 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:27.850581 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-glsqk\"" Apr 24 21:35:27.855489 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:27.855468 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-r4gw9"] Apr 24 21:35:27.858875 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:27.858855 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-dtlvr"] Apr 24 21:35:27.861636 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:27.861616 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-dtlvr" Apr 24 21:35:27.863666 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:27.863646 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 24 21:35:27.863749 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:27.863650 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-vlk6d\"" Apr 24 21:35:27.872784 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:27.872762 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-dtlvr"] Apr 24 21:35:27.891896 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:27.891876 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3f40481-9db3-448a-97e9-675a97385be5-cert\") pod \"odh-model-controller-696fc77849-dtlvr\" (UID: \"d3f40481-9db3-448a-97e9-675a97385be5\") " pod="kserve/odh-model-controller-696fc77849-dtlvr" Apr 24 21:35:27.891975 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:27.891906 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pmk6\" (UniqueName: \"kubernetes.io/projected/09fda931-98d2-45f5-a0b5-8d8302f3e524-kube-api-access-7pmk6\") pod \"model-serving-api-86f7b4b499-r4gw9\" (UID: \"09fda931-98d2-45f5-a0b5-8d8302f3e524\") " pod="kserve/model-serving-api-86f7b4b499-r4gw9" Apr 24 21:35:27.891975 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:27.891927 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/09fda931-98d2-45f5-a0b5-8d8302f3e524-tls-certs\") pod \"model-serving-api-86f7b4b499-r4gw9\" (UID: \"09fda931-98d2-45f5-a0b5-8d8302f3e524\") " pod="kserve/model-serving-api-86f7b4b499-r4gw9" Apr 24 21:35:27.892068 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:27.891999 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg5gt\" (UniqueName: \"kubernetes.io/projected/d3f40481-9db3-448a-97e9-675a97385be5-kube-api-access-dg5gt\") pod \"odh-model-controller-696fc77849-dtlvr\" (UID: \"d3f40481-9db3-448a-97e9-675a97385be5\") " pod="kserve/odh-model-controller-696fc77849-dtlvr" Apr 24 21:35:27.992707 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:27.992688 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3f40481-9db3-448a-97e9-675a97385be5-cert\") pod \"odh-model-controller-696fc77849-dtlvr\" (UID: \"d3f40481-9db3-448a-97e9-675a97385be5\") " pod="kserve/odh-model-controller-696fc77849-dtlvr" Apr 24 21:35:27.992796 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:27.992715 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pmk6\" (UniqueName: \"kubernetes.io/projected/09fda931-98d2-45f5-a0b5-8d8302f3e524-kube-api-access-7pmk6\") pod \"model-serving-api-86f7b4b499-r4gw9\" (UID: \"09fda931-98d2-45f5-a0b5-8d8302f3e524\") " pod="kserve/model-serving-api-86f7b4b499-r4gw9" Apr 24 21:35:27.992796 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:27.992736 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/09fda931-98d2-45f5-a0b5-8d8302f3e524-tls-certs\") pod \"model-serving-api-86f7b4b499-r4gw9\" (UID: \"09fda931-98d2-45f5-a0b5-8d8302f3e524\") " pod="kserve/model-serving-api-86f7b4b499-r4gw9" Apr 24 21:35:27.992796 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:27.992777 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg5gt\" (UniqueName: \"kubernetes.io/projected/d3f40481-9db3-448a-97e9-675a97385be5-kube-api-access-dg5gt\") pod \"odh-model-controller-696fc77849-dtlvr\" (UID: \"d3f40481-9db3-448a-97e9-675a97385be5\") " pod="kserve/odh-model-controller-696fc77849-dtlvr" Apr 24 21:35:27.994981 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:27.994965 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 24 21:35:27.995082 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:27.994989 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 24 21:35:28.003079 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:28.003054 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg5gt\" (UniqueName: \"kubernetes.io/projected/d3f40481-9db3-448a-97e9-675a97385be5-kube-api-access-dg5gt\") pod \"odh-model-controller-696fc77849-dtlvr\" (UID: \"d3f40481-9db3-448a-97e9-675a97385be5\") " pod="kserve/odh-model-controller-696fc77849-dtlvr" Apr 24 21:35:28.003079 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:28.003072 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pmk6\" (UniqueName: \"kubernetes.io/projected/09fda931-98d2-45f5-a0b5-8d8302f3e524-kube-api-access-7pmk6\") pod \"model-serving-api-86f7b4b499-r4gw9\" (UID: \"09fda931-98d2-45f5-a0b5-8d8302f3e524\") " pod="kserve/model-serving-api-86f7b4b499-r4gw9" Apr 24 21:35:28.003226 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:35:28.003102 2570 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 24 21:35:28.003226 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:35:28.003115 2570 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 24 21:35:28.003226 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:35:28.003140 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3f40481-9db3-448a-97e9-675a97385be5-cert podName:d3f40481-9db3-448a-97e9-675a97385be5 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:35:28.503128096 +0000 UTC m=+480.926635481 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d3f40481-9db3-448a-97e9-675a97385be5-cert") pod "odh-model-controller-696fc77849-dtlvr" (UID: "d3f40481-9db3-448a-97e9-675a97385be5") : secret "odh-model-controller-webhook-cert" not found Apr 24 21:35:28.003226 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:35:28.003153 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09fda931-98d2-45f5-a0b5-8d8302f3e524-tls-certs podName:09fda931-98d2-45f5-a0b5-8d8302f3e524 nodeName:}" failed. No retries permitted until 2026-04-24 21:35:28.503147768 +0000 UTC m=+480.926655153 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/09fda931-98d2-45f5-a0b5-8d8302f3e524-tls-certs") pod "model-serving-api-86f7b4b499-r4gw9" (UID: "09fda931-98d2-45f5-a0b5-8d8302f3e524") : secret "model-serving-api-tls" not found Apr 24 21:35:28.596234 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:28.596206 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3f40481-9db3-448a-97e9-675a97385be5-cert\") pod \"odh-model-controller-696fc77849-dtlvr\" (UID: \"d3f40481-9db3-448a-97e9-675a97385be5\") " pod="kserve/odh-model-controller-696fc77849-dtlvr" Apr 24 21:35:28.596383 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:28.596243 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/09fda931-98d2-45f5-a0b5-8d8302f3e524-tls-certs\") pod \"model-serving-api-86f7b4b499-r4gw9\" (UID: \"09fda931-98d2-45f5-a0b5-8d8302f3e524\") " pod="kserve/model-serving-api-86f7b4b499-r4gw9" Apr 24 21:35:28.598759 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:28.598736 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3f40481-9db3-448a-97e9-675a97385be5-cert\") pod \"odh-model-controller-696fc77849-dtlvr\" (UID: \"d3f40481-9db3-448a-97e9-675a97385be5\") " pod="kserve/odh-model-controller-696fc77849-dtlvr" Apr 24 21:35:28.598882 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:28.598809 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/09fda931-98d2-45f5-a0b5-8d8302f3e524-tls-certs\") pod \"model-serving-api-86f7b4b499-r4gw9\" (UID: \"09fda931-98d2-45f5-a0b5-8d8302f3e524\") " pod="kserve/model-serving-api-86f7b4b499-r4gw9" Apr 24 21:35:28.761106 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:28.761069 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-glsqk\"" Apr 24 21:35:28.769141 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:28.769127 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-r4gw9" Apr 24 21:35:28.772820 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:28.772805 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-vlk6d\"" Apr 24 21:35:28.780886 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:28.780858 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-dtlvr" Apr 24 21:35:28.896297 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:28.896267 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-r4gw9"] Apr 24 21:35:28.899117 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:35:28.899091 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09fda931_98d2_45f5_a0b5_8d8302f3e524.slice/crio-f1aa2d7e6e86aa8d27a65f62c9c1d9bf6b6f4da282dccda392b315794699160d WatchSource:0}: Error finding container f1aa2d7e6e86aa8d27a65f62c9c1d9bf6b6f4da282dccda392b315794699160d: Status 404 returned error can't find the container with id f1aa2d7e6e86aa8d27a65f62c9c1d9bf6b6f4da282dccda392b315794699160d Apr 24 21:35:28.913214 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:28.913183 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-dtlvr"] Apr 24 21:35:28.915468 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:35:28.915437 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3f40481_9db3_448a_97e9_675a97385be5.slice/crio-8cc2945eacaeb5a6c4c1aa292523d98dd5332f44c60b08303e5b33cb29047140 WatchSource:0}: Error finding container 8cc2945eacaeb5a6c4c1aa292523d98dd5332f44c60b08303e5b33cb29047140: Status 404 returned error can't find the container with id 8cc2945eacaeb5a6c4c1aa292523d98dd5332f44c60b08303e5b33cb29047140 Apr 24 21:35:29.395351 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:29.395303 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-dtlvr" event={"ID":"d3f40481-9db3-448a-97e9-675a97385be5","Type":"ContainerStarted","Data":"8cc2945eacaeb5a6c4c1aa292523d98dd5332f44c60b08303e5b33cb29047140"} Apr 24 21:35:29.396490 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:29.396462 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-r4gw9" event={"ID":"09fda931-98d2-45f5-a0b5-8d8302f3e524","Type":"ContainerStarted","Data":"f1aa2d7e6e86aa8d27a65f62c9c1d9bf6b6f4da282dccda392b315794699160d"} Apr 24 21:35:33.409688 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:33.409648 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-dtlvr" event={"ID":"d3f40481-9db3-448a-97e9-675a97385be5","Type":"ContainerStarted","Data":"c17ff2f3e084474a01c05e647fc092b07d98e65cbb62b89fedc5be7c1f7ceb95"} Apr 24 21:35:33.410132 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:33.409789 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-dtlvr" Apr 24 21:35:33.410985 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:33.410961 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-r4gw9" event={"ID":"09fda931-98d2-45f5-a0b5-8d8302f3e524","Type":"ContainerStarted","Data":"7f9e856c540c5f4bd1f780ebcbb7e1eb63b15aad3a482ef6986b17c264c751d8"} Apr 24 21:35:33.411096 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:33.411063 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-r4gw9" Apr 24 21:35:33.426555 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:33.426506 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-dtlvr" 
podStartSLOduration=2.856972227 podStartE2EDuration="6.42649214s" podCreationTimestamp="2026-04-24 21:35:27 +0000 UTC" firstStartedPulling="2026-04-24 21:35:28.916696846 +0000 UTC m=+481.340204231" lastFinishedPulling="2026-04-24 21:35:32.486216758 +0000 UTC m=+484.909724144" observedRunningTime="2026-04-24 21:35:33.424799523 +0000 UTC m=+485.848306941" watchObservedRunningTime="2026-04-24 21:35:33.42649214 +0000 UTC m=+485.849999553" Apr 24 21:35:33.441418 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:33.441375 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-r4gw9" podStartSLOduration=2.803507349 podStartE2EDuration="6.441364363s" podCreationTimestamp="2026-04-24 21:35:27 +0000 UTC" firstStartedPulling="2026-04-24 21:35:28.900672176 +0000 UTC m=+481.324179576" lastFinishedPulling="2026-04-24 21:35:32.538529181 +0000 UTC m=+484.962036590" observedRunningTime="2026-04-24 21:35:33.440577371 +0000 UTC m=+485.864084774" watchObservedRunningTime="2026-04-24 21:35:33.441364363 +0000 UTC m=+485.864871773" Apr 24 21:35:44.418263 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:44.418233 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-dtlvr" Apr 24 21:35:44.420341 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:44.420321 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-r4gw9" Apr 24 21:35:56.278045 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:56.277976 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-j4b4k"] Apr 24 21:35:56.280815 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:56.280790 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-j4b4k" Apr 24 21:35:56.282938 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:56.282918 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 24 21:35:56.289530 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:56.289510 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-j4b4k"] Apr 24 21:35:56.381633 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:56.381610 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/09c204af-69e8-454f-8917-1fcf4a00eaec-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-j4b4k\" (UID: \"09c204af-69e8-454f-8917-1fcf4a00eaec\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-j4b4k" Apr 24 21:35:56.381725 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:56.381655 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mpc8\" (UniqueName: \"kubernetes.io/projected/09c204af-69e8-454f-8917-1fcf4a00eaec-kube-api-access-2mpc8\") pod \"seaweedfs-tls-custom-ddd4dbfd-j4b4k\" (UID: \"09c204af-69e8-454f-8917-1fcf4a00eaec\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-j4b4k" Apr 24 21:35:56.482879 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:56.482853 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mpc8\" (UniqueName: \"kubernetes.io/projected/09c204af-69e8-454f-8917-1fcf4a00eaec-kube-api-access-2mpc8\") pod \"seaweedfs-tls-custom-ddd4dbfd-j4b4k\" (UID: \"09c204af-69e8-454f-8917-1fcf4a00eaec\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-j4b4k" Apr 24 21:35:56.482965 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:56.482895 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/09c204af-69e8-454f-8917-1fcf4a00eaec-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-j4b4k\" (UID: \"09c204af-69e8-454f-8917-1fcf4a00eaec\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-j4b4k" Apr 24 21:35:56.483189 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:56.483175 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/09c204af-69e8-454f-8917-1fcf4a00eaec-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-j4b4k\" (UID: \"09c204af-69e8-454f-8917-1fcf4a00eaec\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-j4b4k" Apr 24 21:35:56.491484 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:56.491468 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mpc8\" (UniqueName: \"kubernetes.io/projected/09c204af-69e8-454f-8917-1fcf4a00eaec-kube-api-access-2mpc8\") pod \"seaweedfs-tls-custom-ddd4dbfd-j4b4k\" (UID: \"09c204af-69e8-454f-8917-1fcf4a00eaec\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-j4b4k" Apr 24 21:35:56.589594 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:56.589545 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-j4b4k" Apr 24 21:35:56.701797 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:56.701759 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-j4b4k"] Apr 24 21:35:56.705905 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:35:56.705880 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09c204af_69e8_454f_8917_1fcf4a00eaec.slice/crio-055bb1ad82ce9f352be957e3c10c22a19ae975af377e21941e1fd4c54d10a0c0 WatchSource:0}: Error finding container 055bb1ad82ce9f352be957e3c10c22a19ae975af377e21941e1fd4c54d10a0c0: Status 404 returned error can't find the container with id 055bb1ad82ce9f352be957e3c10c22a19ae975af377e21941e1fd4c54d10a0c0 Apr 24 21:35:57.470678 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:57.470643 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-j4b4k" event={"ID":"09c204af-69e8-454f-8917-1fcf4a00eaec","Type":"ContainerStarted","Data":"f93765e5c239e9e05473221dc12eab805e0e7a7a7b19ce87913e1a70f89f14a3"} Apr 24 21:35:57.470678 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:57.470675 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-j4b4k" event={"ID":"09c204af-69e8-454f-8917-1fcf4a00eaec","Type":"ContainerStarted","Data":"055bb1ad82ce9f352be957e3c10c22a19ae975af377e21941e1fd4c54d10a0c0"} Apr 24 21:35:57.486543 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:57.486493 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-j4b4k" podStartSLOduration=1.246628294 podStartE2EDuration="1.486478227s" podCreationTimestamp="2026-04-24 21:35:56 +0000 UTC" firstStartedPulling="2026-04-24 21:35:56.707335396 +0000 UTC m=+509.130842781" lastFinishedPulling="2026-04-24 21:35:56.947185328 +0000 UTC m=+509.370692714" observedRunningTime="2026-04-24 21:35:57.485672638 +0000 UTC m=+509.909180045" watchObservedRunningTime="2026-04-24 21:35:57.486478227 +0000 UTC m=+509.909985635" Apr 24 21:35:58.016617 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:58.016588 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-j4b4k"] Apr 24 21:35:59.475326 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:35:59.475269 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-j4b4k" podUID="09c204af-69e8-454f-8917-1fcf4a00eaec" containerName="seaweedfs-tls-custom" containerID="cri-o://f93765e5c239e9e05473221dc12eab805e0e7a7a7b19ce87913e1a70f89f14a3" gracePeriod=30 Apr 24 21:36:00.702631 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:00.702607 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-j4b4k" Apr 24 21:36:00.811898 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:00.811838 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/09c204af-69e8-454f-8917-1fcf4a00eaec-data\") pod \"09c204af-69e8-454f-8917-1fcf4a00eaec\" (UID: \"09c204af-69e8-454f-8917-1fcf4a00eaec\") " Apr 24 21:36:00.811999 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:00.811916 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mpc8\" (UniqueName: \"kubernetes.io/projected/09c204af-69e8-454f-8917-1fcf4a00eaec-kube-api-access-2mpc8\") pod \"09c204af-69e8-454f-8917-1fcf4a00eaec\" (UID: \"09c204af-69e8-454f-8917-1fcf4a00eaec\") " Apr 24 21:36:00.813155 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:00.813133 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09c204af-69e8-454f-8917-1fcf4a00eaec-data" (OuterVolumeSpecName: "data") pod "09c204af-69e8-454f-8917-1fcf4a00eaec" (UID: "09c204af-69e8-454f-8917-1fcf4a00eaec"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:36:00.814055 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:00.814035 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09c204af-69e8-454f-8917-1fcf4a00eaec-kube-api-access-2mpc8" (OuterVolumeSpecName: "kube-api-access-2mpc8") pod "09c204af-69e8-454f-8917-1fcf4a00eaec" (UID: "09c204af-69e8-454f-8917-1fcf4a00eaec"). InnerVolumeSpecName "kube-api-access-2mpc8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:36:00.912316 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:00.912293 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2mpc8\" (UniqueName: \"kubernetes.io/projected/09c204af-69e8-454f-8917-1fcf4a00eaec-kube-api-access-2mpc8\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:36:00.912316 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:00.912314 2570 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/09c204af-69e8-454f-8917-1fcf4a00eaec-data\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:36:01.481342 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.481314 2570 generic.go:358] "Generic (PLEG): container finished" podID="09c204af-69e8-454f-8917-1fcf4a00eaec" containerID="f93765e5c239e9e05473221dc12eab805e0e7a7a7b19ce87913e1a70f89f14a3" exitCode=0 Apr 24 21:36:01.481497 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.481363 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-j4b4k" event={"ID":"09c204af-69e8-454f-8917-1fcf4a00eaec","Type":"ContainerDied","Data":"f93765e5c239e9e05473221dc12eab805e0e7a7a7b19ce87913e1a70f89f14a3"} Apr 24 21:36:01.481497 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.481370 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-j4b4k" Apr 24 21:36:01.481497 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.481384 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-j4b4k" event={"ID":"09c204af-69e8-454f-8917-1fcf4a00eaec","Type":"ContainerDied","Data":"055bb1ad82ce9f352be957e3c10c22a19ae975af377e21941e1fd4c54d10a0c0"} Apr 24 21:36:01.481497 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.481398 2570 scope.go:117] "RemoveContainer" containerID="f93765e5c239e9e05473221dc12eab805e0e7a7a7b19ce87913e1a70f89f14a3" Apr 24 21:36:01.490607 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.490592 2570 scope.go:117] "RemoveContainer" containerID="f93765e5c239e9e05473221dc12eab805e0e7a7a7b19ce87913e1a70f89f14a3" Apr 24 21:36:01.490875 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:36:01.490858 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f93765e5c239e9e05473221dc12eab805e0e7a7a7b19ce87913e1a70f89f14a3\": container with ID starting with f93765e5c239e9e05473221dc12eab805e0e7a7a7b19ce87913e1a70f89f14a3 not found: ID does not exist" containerID="f93765e5c239e9e05473221dc12eab805e0e7a7a7b19ce87913e1a70f89f14a3" Apr 24 21:36:01.490918 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.490884 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f93765e5c239e9e05473221dc12eab805e0e7a7a7b19ce87913e1a70f89f14a3"} err="failed to get container status \"f93765e5c239e9e05473221dc12eab805e0e7a7a7b19ce87913e1a70f89f14a3\": rpc error: code = NotFound desc = could not find container \"f93765e5c239e9e05473221dc12eab805e0e7a7a7b19ce87913e1a70f89f14a3\": container with ID starting with f93765e5c239e9e05473221dc12eab805e0e7a7a7b19ce87913e1a70f89f14a3 not found: ID does not exist" Apr 24 21:36:01.503747 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.503726 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-j4b4k"] Apr 24 21:36:01.508955 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.508930 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-j4b4k"] Apr 24 21:36:01.546348 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.546330 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-5gk7n"] Apr 24 21:36:01.546552 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.546541 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09c204af-69e8-454f-8917-1fcf4a00eaec" containerName="seaweedfs-tls-custom" Apr 24 21:36:01.546600 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.546553 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="09c204af-69e8-454f-8917-1fcf4a00eaec" containerName="seaweedfs-tls-custom" Apr 24 21:36:01.546600 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.546598 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="09c204af-69e8-454f-8917-1fcf4a00eaec" containerName="seaweedfs-tls-custom" Apr 24 21:36:01.549186 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.549163 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5gk7n" Apr 24 21:36:01.551488 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.551470 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\"" Apr 24 21:36:01.551737 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.551713 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 24 21:36:01.559671 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.559654 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-5gk7n"] Apr 24 21:36:01.618422 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.618395 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/c778c90e-bec7-4db6-83cd-81ff3c204a34-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-5gk7n\" (UID: \"c778c90e-bec7-4db6-83cd-81ff3c204a34\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5gk7n" Apr 24 21:36:01.618518 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.618434 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c778c90e-bec7-4db6-83cd-81ff3c204a34-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-5gk7n\" (UID: \"c778c90e-bec7-4db6-83cd-81ff3c204a34\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5gk7n" Apr 24 21:36:01.618518 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.618466 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmnzp\" (UniqueName: \"kubernetes.io/projected/c778c90e-bec7-4db6-83cd-81ff3c204a34-kube-api-access-fmnzp\") pod \"seaweedfs-tls-custom-5c88b85bb7-5gk7n\" (UID: \"c778c90e-bec7-4db6-83cd-81ff3c204a34\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5gk7n" Apr 24 21:36:01.719221 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.719198 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/c778c90e-bec7-4db6-83cd-81ff3c204a34-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-5gk7n\" (UID: \"c778c90e-bec7-4db6-83cd-81ff3c204a34\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5gk7n" Apr 24 21:36:01.719543 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.719234 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c778c90e-bec7-4db6-83cd-81ff3c204a34-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-5gk7n\" (UID: \"c778c90e-bec7-4db6-83cd-81ff3c204a34\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5gk7n" Apr 24 21:36:01.719543 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.719268 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmnzp\" (UniqueName: \"kubernetes.io/projected/c778c90e-bec7-4db6-83cd-81ff3c204a34-kube-api-access-fmnzp\") pod \"seaweedfs-tls-custom-5c88b85bb7-5gk7n\" (UID: \"c778c90e-bec7-4db6-83cd-81ff3c204a34\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5gk7n" Apr 24 21:36:01.719625 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.719583 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c778c90e-bec7-4db6-83cd-81ff3c204a34-data\") pod 
\"seaweedfs-tls-custom-5c88b85bb7-5gk7n\" (UID: \"c778c90e-bec7-4db6-83cd-81ff3c204a34\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5gk7n" Apr 24 21:36:01.721714 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.721693 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/c778c90e-bec7-4db6-83cd-81ff3c204a34-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-5gk7n\" (UID: \"c778c90e-bec7-4db6-83cd-81ff3c204a34\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5gk7n" Apr 24 21:36:01.728551 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.728529 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmnzp\" (UniqueName: \"kubernetes.io/projected/c778c90e-bec7-4db6-83cd-81ff3c204a34-kube-api-access-fmnzp\") pod \"seaweedfs-tls-custom-5c88b85bb7-5gk7n\" (UID: \"c778c90e-bec7-4db6-83cd-81ff3c204a34\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5gk7n" Apr 24 21:36:01.857553 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.857502 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5gk7n" Apr 24 21:36:01.971766 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:01.971728 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-5gk7n"] Apr 24 21:36:01.976660 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:36:01.976622 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc778c90e_bec7_4db6_83cd_81ff3c204a34.slice/crio-fd60dc539daa485d753e61952b033c41f0305ce0d5448d39a442c35bfd88d0c6 WatchSource:0}: Error finding container fd60dc539daa485d753e61952b033c41f0305ce0d5448d39a442c35bfd88d0c6: Status 404 returned error can't find the container with id fd60dc539daa485d753e61952b033c41f0305ce0d5448d39a442c35bfd88d0c6 Apr 24 21:36:02.066262 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:02.066240 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09c204af-69e8-454f-8917-1fcf4a00eaec" path="/var/lib/kubelet/pods/09c204af-69e8-454f-8917-1fcf4a00eaec/volumes" Apr 24 21:36:02.487138 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:02.487106 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5gk7n" event={"ID":"c778c90e-bec7-4db6-83cd-81ff3c204a34","Type":"ContainerStarted","Data":"e90c38e67057cfcdc162ec09d17b27fd7819af7f9ecc8ba9a866158c72747d8d"} Apr 24 21:36:02.487138 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:02.487140 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5gk7n" event={"ID":"c778c90e-bec7-4db6-83cd-81ff3c204a34","Type":"ContainerStarted","Data":"fd60dc539daa485d753e61952b033c41f0305ce0d5448d39a442c35bfd88d0c6"} Apr 24 21:36:02.502847 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:02.502774 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-5gk7n" podStartSLOduration=1.241268496 podStartE2EDuration="1.5027601s" podCreationTimestamp="2026-04-24 21:36:01 +0000 UTC" firstStartedPulling="2026-04-24 21:36:01.977992649 +0000 UTC m=+514.401500038" lastFinishedPulling="2026-04-24 21:36:02.239484257 +0000 UTC m=+514.662991642" observedRunningTime="2026-04-24 21:36:02.501723017 +0000 UTC m=+514.925230424" watchObservedRunningTime="2026-04-24 21:36:02.5027601 +0000 UTC m=+514.926267504" Apr 24 
21:36:11.250484 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:11.250406 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-gcpxd"] Apr 24 21:36:11.253383 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:11.253368 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-gcpxd" Apr 24 21:36:11.256000 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:11.255975 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 24 21:36:11.256000 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:11.255989 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 24 21:36:11.263669 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:11.263648 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-gcpxd"] Apr 24 21:36:11.390128 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:11.390098 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjkm7\" (UniqueName: \"kubernetes.io/projected/3bafcdab-0482-4bf0-bf70-c504030e1650-kube-api-access-rjkm7\") pod \"seaweedfs-tls-serving-7fd5766db9-gcpxd\" (UID: \"3bafcdab-0482-4bf0-bf70-c504030e1650\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-gcpxd" Apr 24 21:36:11.390269 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:11.390142 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/3bafcdab-0482-4bf0-bf70-c504030e1650-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-gcpxd\" (UID: \"3bafcdab-0482-4bf0-bf70-c504030e1650\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-gcpxd" Apr 24 21:36:11.390269 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:11.390177 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/3bafcdab-0482-4bf0-bf70-c504030e1650-data\") pod \"seaweedfs-tls-serving-7fd5766db9-gcpxd\" (UID: \"3bafcdab-0482-4bf0-bf70-c504030e1650\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-gcpxd" Apr 24 21:36:11.491266 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:11.491239 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjkm7\" (UniqueName: \"kubernetes.io/projected/3bafcdab-0482-4bf0-bf70-c504030e1650-kube-api-access-rjkm7\") pod \"seaweedfs-tls-serving-7fd5766db9-gcpxd\" (UID: \"3bafcdab-0482-4bf0-bf70-c504030e1650\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-gcpxd" Apr 24 21:36:11.491382 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:11.491269 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/3bafcdab-0482-4bf0-bf70-c504030e1650-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-gcpxd\" (UID: \"3bafcdab-0482-4bf0-bf70-c504030e1650\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-gcpxd" Apr 24 21:36:11.491382 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:11.491299 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/3bafcdab-0482-4bf0-bf70-c504030e1650-data\") pod \"seaweedfs-tls-serving-7fd5766db9-gcpxd\" (UID: \"3bafcdab-0482-4bf0-bf70-c504030e1650\") 
" pod="kserve/seaweedfs-tls-serving-7fd5766db9-gcpxd" Apr 24 21:36:11.491503 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:36:11.491394 2570 projected.go:264] Couldn't get secret kserve/seaweedfs-tls-serving: secret "seaweedfs-tls-serving" not found Apr 24 21:36:11.491503 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:36:11.491414 2570 projected.go:194] Error preparing data for projected volume seaweedfs-tls-serving for pod kserve/seaweedfs-tls-serving-7fd5766db9-gcpxd: secret "seaweedfs-tls-serving" not found Apr 24 21:36:11.491503 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:36:11.491488 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3bafcdab-0482-4bf0-bf70-c504030e1650-seaweedfs-tls-serving podName:3bafcdab-0482-4bf0-bf70-c504030e1650 nodeName:}" failed. No retries permitted until 2026-04-24 21:36:11.991467108 +0000 UTC m=+524.414974510 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "seaweedfs-tls-serving" (UniqueName: "kubernetes.io/projected/3bafcdab-0482-4bf0-bf70-c504030e1650-seaweedfs-tls-serving") pod "seaweedfs-tls-serving-7fd5766db9-gcpxd" (UID: "3bafcdab-0482-4bf0-bf70-c504030e1650") : secret "seaweedfs-tls-serving" not found Apr 24 21:36:11.491667 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:11.491584 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/3bafcdab-0482-4bf0-bf70-c504030e1650-data\") pod \"seaweedfs-tls-serving-7fd5766db9-gcpxd\" (UID: \"3bafcdab-0482-4bf0-bf70-c504030e1650\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-gcpxd" Apr 24 21:36:11.504062 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:11.503999 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjkm7\" (UniqueName: \"kubernetes.io/projected/3bafcdab-0482-4bf0-bf70-c504030e1650-kube-api-access-rjkm7\") pod \"seaweedfs-tls-serving-7fd5766db9-gcpxd\" (UID: \"3bafcdab-0482-4bf0-bf70-c504030e1650\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-gcpxd" Apr 24 21:36:11.995079 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:11.995051 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/3bafcdab-0482-4bf0-bf70-c504030e1650-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-gcpxd\" (UID: \"3bafcdab-0482-4bf0-bf70-c504030e1650\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-gcpxd" Apr 24 21:36:11.997484 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:11.997463 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/3bafcdab-0482-4bf0-bf70-c504030e1650-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-gcpxd\" (UID: \"3bafcdab-0482-4bf0-bf70-c504030e1650\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-gcpxd" Apr 24 21:36:12.162347 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:12.162323 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-gcpxd" Apr 24 21:36:12.279769 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:12.279698 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-gcpxd"] Apr 24 21:36:12.282673 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:36:12.282637 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bafcdab_0482_4bf0_bf70_c504030e1650.slice/crio-19f4698a3415c1236c4b3efc3c40cfbeeed42e10a21868fe6540e8a27100d3b8 WatchSource:0}: Error finding container 19f4698a3415c1236c4b3efc3c40cfbeeed42e10a21868fe6540e8a27100d3b8: Status 404 returned error can't find the container with id 19f4698a3415c1236c4b3efc3c40cfbeeed42e10a21868fe6540e8a27100d3b8 Apr 24 21:36:12.515018 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:12.514986 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-gcpxd" event={"ID":"3bafcdab-0482-4bf0-bf70-c504030e1650","Type":"ContainerStarted","Data":"19f4698a3415c1236c4b3efc3c40cfbeeed42e10a21868fe6540e8a27100d3b8"} Apr 24 21:36:13.519167 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:13.519125 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-gcpxd" event={"ID":"3bafcdab-0482-4bf0-bf70-c504030e1650","Type":"ContainerStarted","Data":"601008a6a88dfe05141dcf15b06c31068654e0dd6dfed684d37af3f82db7181f"} Apr 24 21:36:13.536796 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:13.536749 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-gcpxd" podStartSLOduration=2.280587483 podStartE2EDuration="2.536734847s" podCreationTimestamp="2026-04-24 21:36:11 +0000 UTC" firstStartedPulling="2026-04-24 21:36:12.283809248 +0000 UTC m=+524.707316633" lastFinishedPulling="2026-04-24 21:36:12.53995661 +0000 UTC m=+524.963463997" observedRunningTime="2026-04-24 21:36:13.535702373 +0000 UTC m=+525.959209781" watchObservedRunningTime="2026-04-24 21:36:13.536734847 +0000 UTC m=+525.960242254" Apr 24 21:36:14.052368 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:14.052334 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-gvthv"] Apr 24 21:36:14.056445 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:14.056426 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-gvthv" Apr 24 21:36:14.067166 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:14.067144 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-gvthv"] Apr 24 21:36:14.211643 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:14.211613 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc4h8\" (UniqueName: \"kubernetes.io/projected/c98e1427-1554-4779-bd5e-b7aece1c6da5-kube-api-access-bc4h8\") pod \"s3-tls-init-serving-gvthv\" (UID: \"c98e1427-1554-4779-bd5e-b7aece1c6da5\") " pod="kserve/s3-tls-init-serving-gvthv" Apr 24 21:36:14.312940 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:14.312886 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bc4h8\" (UniqueName: \"kubernetes.io/projected/c98e1427-1554-4779-bd5e-b7aece1c6da5-kube-api-access-bc4h8\") pod \"s3-tls-init-serving-gvthv\" (UID: \"c98e1427-1554-4779-bd5e-b7aece1c6da5\") " pod="kserve/s3-tls-init-serving-gvthv" Apr 24 21:36:14.321298 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:14.321275 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc4h8\" (UniqueName: \"kubernetes.io/projected/c98e1427-1554-4779-bd5e-b7aece1c6da5-kube-api-access-bc4h8\") pod \"s3-tls-init-serving-gvthv\" (UID: \"c98e1427-1554-4779-bd5e-b7aece1c6da5\") " pod="kserve/s3-tls-init-serving-gvthv" Apr 24 21:36:14.365906 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:14.365887 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-gvthv" Apr 24 21:36:14.479511 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:14.479483 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-gvthv"] Apr 24 21:36:14.482226 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:36:14.482199 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc98e1427_1554_4779_bd5e_b7aece1c6da5.slice/crio-0e78912a79a40d9670c5aab53f5e7ce0d7dcd08b42bb747d49588ede7c3e9ef7 WatchSource:0}: Error finding container 0e78912a79a40d9670c5aab53f5e7ce0d7dcd08b42bb747d49588ede7c3e9ef7: Status 404 returned error can't find the container with id 0e78912a79a40d9670c5aab53f5e7ce0d7dcd08b42bb747d49588ede7c3e9ef7 Apr 24 21:36:14.522909 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:14.522877 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-gvthv" event={"ID":"c98e1427-1554-4779-bd5e-b7aece1c6da5","Type":"ContainerStarted","Data":"0e78912a79a40d9670c5aab53f5e7ce0d7dcd08b42bb747d49588ede7c3e9ef7"} Apr 24 21:36:19.539237 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:19.539201 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-gvthv" event={"ID":"c98e1427-1554-4779-bd5e-b7aece1c6da5","Type":"ContainerStarted","Data":"8b5a49dd79a77a6341c40da59bb9805c46549febd146f071c5a7900d0f60e904"} Apr 24 21:36:19.557940 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:19.557893 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-gvthv" podStartSLOduration=1.17933067 podStartE2EDuration="5.557878756s" podCreationTimestamp="2026-04-24 21:36:14 +0000 UTC" firstStartedPulling="2026-04-24 21:36:14.48404289 +0000 UTC m=+526.907550275" lastFinishedPulling="2026-04-24 21:36:18.862590973 +0000 UTC m=+531.286098361" 
observedRunningTime="2026-04-24 21:36:19.556568779 +0000 UTC m=+531.980076185" watchObservedRunningTime="2026-04-24 21:36:19.557878756 +0000 UTC m=+531.981386163" Apr 24 21:36:24.554206 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:24.554172 2570 generic.go:358] "Generic (PLEG): container finished" podID="c98e1427-1554-4779-bd5e-b7aece1c6da5" containerID="8b5a49dd79a77a6341c40da59bb9805c46549febd146f071c5a7900d0f60e904" exitCode=0 Apr 24 21:36:24.554560 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:24.554236 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-gvthv" event={"ID":"c98e1427-1554-4779-bd5e-b7aece1c6da5","Type":"ContainerDied","Data":"8b5a49dd79a77a6341c40da59bb9805c46549febd146f071c5a7900d0f60e904"} Apr 24 21:36:25.680969 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:25.680944 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-gvthv" Apr 24 21:36:25.796219 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:25.796194 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc4h8\" (UniqueName: \"kubernetes.io/projected/c98e1427-1554-4779-bd5e-b7aece1c6da5-kube-api-access-bc4h8\") pod \"c98e1427-1554-4779-bd5e-b7aece1c6da5\" (UID: \"c98e1427-1554-4779-bd5e-b7aece1c6da5\") " Apr 24 21:36:25.798306 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:25.798284 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c98e1427-1554-4779-bd5e-b7aece1c6da5-kube-api-access-bc4h8" (OuterVolumeSpecName: "kube-api-access-bc4h8") pod "c98e1427-1554-4779-bd5e-b7aece1c6da5" (UID: "c98e1427-1554-4779-bd5e-b7aece1c6da5"). InnerVolumeSpecName "kube-api-access-bc4h8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:36:25.896785 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:25.896738 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bc4h8\" (UniqueName: \"kubernetes.io/projected/c98e1427-1554-4779-bd5e-b7aece1c6da5-kube-api-access-bc4h8\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:36:26.560535 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:26.560504 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-gvthv" event={"ID":"c98e1427-1554-4779-bd5e-b7aece1c6da5","Type":"ContainerDied","Data":"0e78912a79a40d9670c5aab53f5e7ce0d7dcd08b42bb747d49588ede7c3e9ef7"} Apr 24 21:36:26.560535 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:26.560534 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e78912a79a40d9670c5aab53f5e7ce0d7dcd08b42bb747d49588ede7c3e9ef7" Apr 24 21:36:26.560723 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:26.560545 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-gvthv" Apr 24 21:36:35.692150 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:35.692121 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w"] Apr 24 21:36:35.692574 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:35.692378 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c98e1427-1554-4779-bd5e-b7aece1c6da5" containerName="s3-tls-init-serving" Apr 24 21:36:35.692574 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:35.692389 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98e1427-1554-4779-bd5e-b7aece1c6da5" containerName="s3-tls-init-serving" Apr 24 21:36:35.692574 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:35.692437 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="c98e1427-1554-4779-bd5e-b7aece1c6da5" containerName="s3-tls-init-serving" Apr 24 21:36:35.698413 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:35.698386 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:36:35.712557 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:35.712527 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-qlzl5\"" Apr 24 21:36:35.712678 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:35.712600 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-predictor-serving-cert\"" Apr 24 21:36:35.712678 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:35.712617 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:36:35.712800 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:35.712705 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:36:35.714857 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:35.714791 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\"" Apr 24 21:36:35.715156 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:35.715136 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w"] Apr 24 21:36:35.861848 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:35.861818 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt79n\" (UniqueName: \"kubernetes.io/projected/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-kube-api-access-nt79n\") pod \"isvc-sklearn-batcher-predictor-67598ff486-n584w\" (UID: \"b5c6bcc7-6504-4840-8c68-a1f58727fd5a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:36:35.861967 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:35.861871 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-67598ff486-n584w\" (UID: \"b5c6bcc7-6504-4840-8c68-a1f58727fd5a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:36:35.861967 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:35.861900 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-67598ff486-n584w\" (UID: \"b5c6bcc7-6504-4840-8c68-a1f58727fd5a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:36:35.861967 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:35.861942 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-67598ff486-n584w\" (UID: \"b5c6bcc7-6504-4840-8c68-a1f58727fd5a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:36:35.962262 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:35.962210 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nt79n\" (UniqueName: \"kubernetes.io/projected/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-kube-api-access-nt79n\") pod \"isvc-sklearn-batcher-predictor-67598ff486-n584w\" (UID: \"b5c6bcc7-6504-4840-8c68-a1f58727fd5a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:36:35.962262 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:35.962239 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-67598ff486-n584w\" (UID: \"b5c6bcc7-6504-4840-8c68-a1f58727fd5a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:36:35.962382 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:35.962261 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-67598ff486-n584w\" (UID: \"b5c6bcc7-6504-4840-8c68-a1f58727fd5a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:36:35.962382 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:35.962306 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-67598ff486-n584w\" (UID: \"b5c6bcc7-6504-4840-8c68-a1f58727fd5a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:36:35.962768 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:35.962742 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-67598ff486-n584w\" (UID: \"b5c6bcc7-6504-4840-8c68-a1f58727fd5a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:36:35.962950 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:35.962931 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-67598ff486-n584w\" (UID: \"b5c6bcc7-6504-4840-8c68-a1f58727fd5a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:36:35.964804 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:35.964787 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-67598ff486-n584w\" (UID: \"b5c6bcc7-6504-4840-8c68-a1f58727fd5a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:36:35.971177 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:35.971151 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt79n\" (UniqueName: \"kubernetes.io/projected/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-kube-api-access-nt79n\") pod \"isvc-sklearn-batcher-predictor-67598ff486-n584w\" (UID: \"b5c6bcc7-6504-4840-8c68-a1f58727fd5a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:36:36.009982 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:36.009963 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:36:36.130843 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:36.130787 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w"] Apr 24 21:36:36.135308 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:36:36.135280 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5c6bcc7_6504_4840_8c68_a1f58727fd5a.slice/crio-07e6a77787848c55eb70177f91637d2a35bfb430137b4a49e5a6edc97566246a WatchSource:0}: Error finding container 07e6a77787848c55eb70177f91637d2a35bfb430137b4a49e5a6edc97566246a: Status 404 returned error can't find the container with id 07e6a77787848c55eb70177f91637d2a35bfb430137b4a49e5a6edc97566246a Apr 24 21:36:36.586647 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:36.586617 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" event={"ID":"b5c6bcc7-6504-4840-8c68-a1f58727fd5a","Type":"ContainerStarted","Data":"07e6a77787848c55eb70177f91637d2a35bfb430137b4a49e5a6edc97566246a"} Apr 24 21:36:40.597929 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:40.597888 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" event={"ID":"b5c6bcc7-6504-4840-8c68-a1f58727fd5a","Type":"ContainerStarted","Data":"386cb5eccc1a42b848bf80ec544220d20bd10c6de8559ac726406bdf6a389e53"} Apr 24 21:36:44.610394 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:44.610361 2570 generic.go:358] "Generic (PLEG): container finished" podID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerID="386cb5eccc1a42b848bf80ec544220d20bd10c6de8559ac726406bdf6a389e53" exitCode=0 Apr 24 21:36:44.610863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:44.610436 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" event={"ID":"b5c6bcc7-6504-4840-8c68-a1f58727fd5a","Type":"ContainerDied","Data":"386cb5eccc1a42b848bf80ec544220d20bd10c6de8559ac726406bdf6a389e53"} Apr 24 
21:36:58.659253 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:36:58.659214 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" event={"ID":"b5c6bcc7-6504-4840-8c68-a1f58727fd5a","Type":"ContainerStarted","Data":"cafc55f3415ac8aee38d2e3edfbe13f419523a00cd41f20ab46bab77a6c80127"} Apr 24 21:37:00.666531 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:00.666500 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" event={"ID":"b5c6bcc7-6504-4840-8c68-a1f58727fd5a","Type":"ContainerStarted","Data":"6e81114b21326c42a618c1ca9f97fae221f1cf58fcf3ed40550626e230a1f258"} Apr 24 21:37:03.676597 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:03.676560 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" event={"ID":"b5c6bcc7-6504-4840-8c68-a1f58727fd5a","Type":"ContainerStarted","Data":"bac8a7755b33d6851ce5f30eadcdde192bf1a4d57b0e73a58b175252f56659f7"} Apr 24 21:37:03.676962 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:03.676798 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:37:03.676962 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:03.676914 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:37:03.678058 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:03.678013 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:37:03.697673 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:03.697627 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podStartSLOduration=1.794014986 podStartE2EDuration="28.697611499s" podCreationTimestamp="2026-04-24 21:36:35 +0000 UTC" firstStartedPulling="2026-04-24 21:36:36.137231622 +0000 UTC m=+548.560739007" lastFinishedPulling="2026-04-24 21:37:03.040828135 +0000 UTC m=+575.464335520" observedRunningTime="2026-04-24 21:37:03.695775247 +0000 UTC m=+576.119282654" watchObservedRunningTime="2026-04-24 21:37:03.697611499 +0000 UTC m=+576.121118907" Apr 24 21:37:04.679885 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:04.679853 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:37:04.680278 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:04.679964 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:37:04.680706 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:04.680682 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" 
Apr 24 21:37:05.682399 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:05.682354 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:37:05.682840 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:05.682759 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:37:05.685904 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:05.685886 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:37:06.685284 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:06.685248 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:37:06.685733 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:06.685612 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:37:16.685496 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:16.685444 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:37:16.686011 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:16.685806 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:37:26.685459 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:26.685412 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:37:26.685899 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:26.685877 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:37:28.003857 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:28.003831 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 21:37:28.006544 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:28.006522 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 21:37:36.685584 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:36.685541 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:37:36.685989 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:36.685963 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:37:46.685198 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:46.685105 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:37:46.685624 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:46.685501 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:37:56.685221 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:56.685174 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:37:56.685740 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:37:56.685715 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:38:06.686256 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:06.686220 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:38:06.688433 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:06.686311 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:38:20.563854 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:20.563809 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w"] Apr 24 21:38:20.564377 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:20.564196 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kserve-container" containerID="cri-o://cafc55f3415ac8aee38d2e3edfbe13f419523a00cd41f20ab46bab77a6c80127" gracePeriod=30 Apr 24 21:38:20.564377 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:20.564243 2570 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kube-rbac-proxy" containerID="cri-o://6e81114b21326c42a618c1ca9f97fae221f1cf58fcf3ed40550626e230a1f258" gracePeriod=30 Apr 24 21:38:20.564489 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:20.564369 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="agent" containerID="cri-o://bac8a7755b33d6851ce5f30eadcdde192bf1a4d57b0e73a58b175252f56659f7" gracePeriod=30 Apr 24 21:38:20.682655 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:20.682570 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.21:8643/healthz\": dial tcp 10.133.0.21:8643: connect: connection refused" Apr 24 21:38:20.687939 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:20.684671 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d"] Apr 24 21:38:20.689529 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:20.689508 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:38:20.691774 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:20.691752 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-predictor-serving-cert\"" Apr 24 21:38:20.691880 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:20.691858 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\"" Apr 24 21:38:20.697135 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:20.697115 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d"] Apr 24 21:38:20.736576 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:20.736555 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d\" (UID: \"44509c85-d6bd-4967-8f7c-7f9cc7096bfb\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:38:20.736658 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:20.736587 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmxhl\" (UniqueName: \"kubernetes.io/projected/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-kube-api-access-gmxhl\") pod \"isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d\" (UID: \"44509c85-d6bd-4967-8f7c-7f9cc7096bfb\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:38:20.736658 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:20.736652 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-proxy-tls\") pod 
\"isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d\" (UID: \"44509c85-d6bd-4967-8f7c-7f9cc7096bfb\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:38:20.736746 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:20.736676 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d\" (UID: \"44509c85-d6bd-4967-8f7c-7f9cc7096bfb\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:38:20.837314 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:20.837243 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d\" (UID: \"44509c85-d6bd-4967-8f7c-7f9cc7096bfb\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:38:20.837314 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:20.837287 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d\" (UID: \"44509c85-d6bd-4967-8f7c-7f9cc7096bfb\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:38:20.837314 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:20.837312 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmxhl\" (UniqueName: \"kubernetes.io/projected/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-kube-api-access-gmxhl\") pod \"isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d\" (UID: \"44509c85-d6bd-4967-8f7c-7f9cc7096bfb\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:38:20.837543 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:20.837356 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d\" (UID: \"44509c85-d6bd-4967-8f7c-7f9cc7096bfb\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:38:20.837543 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:38:20.837455 2570 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-serving-cert: secret "isvc-sklearn-batcher-custom-predictor-serving-cert" not found Apr 24 21:38:20.837543 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:38:20.837522 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-proxy-tls podName:44509c85-d6bd-4967-8f7c-7f9cc7096bfb nodeName:}" failed. No retries permitted until 2026-04-24 21:38:21.337502416 +0000 UTC m=+653.761009802 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-proxy-tls") pod "isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" (UID: "44509c85-d6bd-4967-8f7c-7f9cc7096bfb") : secret "isvc-sklearn-batcher-custom-predictor-serving-cert" not found Apr 24 21:38:20.837788 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:20.837770 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d\" (UID: \"44509c85-d6bd-4967-8f7c-7f9cc7096bfb\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:38:20.837993 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:20.837977 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d\" (UID: \"44509c85-d6bd-4967-8f7c-7f9cc7096bfb\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:38:20.845777 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:20.845758 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmxhl\" (UniqueName: \"kubernetes.io/projected/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-kube-api-access-gmxhl\") pod \"isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d\" (UID: \"44509c85-d6bd-4967-8f7c-7f9cc7096bfb\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:38:20.892483 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:20.892455 2570 generic.go:358] "Generic (PLEG): container finished" podID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerID="6e81114b21326c42a618c1ca9f97fae221f1cf58fcf3ed40550626e230a1f258" exitCode=2 Apr 24 21:38:20.892607 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:20.892532 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" event={"ID":"b5c6bcc7-6504-4840-8c68-a1f58727fd5a","Type":"ContainerDied","Data":"6e81114b21326c42a618c1ca9f97fae221f1cf58fcf3ed40550626e230a1f258"} Apr 24 21:38:21.341539 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:21.341501 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d\" (UID: \"44509c85-d6bd-4967-8f7c-7f9cc7096bfb\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:38:21.344055 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:21.344010 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d\" (UID: \"44509c85-d6bd-4967-8f7c-7f9cc7096bfb\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:38:21.600303 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:21.600220 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:38:21.726056 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:21.726009 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d"] Apr 24 21:38:21.729988 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:38:21.729959 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44509c85_d6bd_4967_8f7c_7f9cc7096bfb.slice/crio-f6bd2d48bd01109b5b9538aeee5140fa06287a9c690ac8438a292a18be57ddc7 WatchSource:0}: Error finding container f6bd2d48bd01109b5b9538aeee5140fa06287a9c690ac8438a292a18be57ddc7: Status 404 returned error can't find the container with id f6bd2d48bd01109b5b9538aeee5140fa06287a9c690ac8438a292a18be57ddc7 Apr 24 21:38:21.897756 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:21.897657 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" event={"ID":"44509c85-d6bd-4967-8f7c-7f9cc7096bfb","Type":"ContainerStarted","Data":"c8ca820c231b51fe54bd49d791c4a7dfdf894c8bee3387b1a07c6f0f44dc5a2b"} Apr 24 21:38:21.897756 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:21.897705 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" event={"ID":"44509c85-d6bd-4967-8f7c-7f9cc7096bfb","Type":"ContainerStarted","Data":"f6bd2d48bd01109b5b9538aeee5140fa06287a9c690ac8438a292a18be57ddc7"} Apr 24 21:38:24.907394 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:24.907311 2570 generic.go:358] "Generic (PLEG): container finished" podID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerID="cafc55f3415ac8aee38d2e3edfbe13f419523a00cd41f20ab46bab77a6c80127" exitCode=0 Apr 24 21:38:24.907394 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:24.907379 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" event={"ID":"b5c6bcc7-6504-4840-8c68-a1f58727fd5a","Type":"ContainerDied","Data":"cafc55f3415ac8aee38d2e3edfbe13f419523a00cd41f20ab46bab77a6c80127"} Apr 24 21:38:25.682719 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:25.682676 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.21:8643/healthz\": dial tcp 10.133.0.21:8643: connect: connection refused" Apr 24 21:38:25.910989 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:25.910959 2570 generic.go:358] "Generic (PLEG): container finished" podID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerID="c8ca820c231b51fe54bd49d791c4a7dfdf894c8bee3387b1a07c6f0f44dc5a2b" exitCode=0 Apr 24 21:38:25.911364 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:25.911007 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" event={"ID":"44509c85-d6bd-4967-8f7c-7f9cc7096bfb","Type":"ContainerDied","Data":"c8ca820c231b51fe54bd49d791c4a7dfdf894c8bee3387b1a07c6f0f44dc5a2b"} Apr 24 21:38:26.685524 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:26.685485 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" 
podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:38:26.685822 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:26.685796 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:38:26.915536 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:26.915503 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" event={"ID":"44509c85-d6bd-4967-8f7c-7f9cc7096bfb","Type":"ContainerStarted","Data":"645d41d5d0104e13ecd1fde5679f50f5afb4ee2e83d4f37659dfe3c620bd6c6a"} Apr 24 21:38:26.915536 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:26.915541 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" event={"ID":"44509c85-d6bd-4967-8f7c-7f9cc7096bfb","Type":"ContainerStarted","Data":"72a19bfc9fb23b429f9b9d338f180722097af9d0449ebeb9fc6951c03d932375"} Apr 24 21:38:26.915901 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:26.915551 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" event={"ID":"44509c85-d6bd-4967-8f7c-7f9cc7096bfb","Type":"ContainerStarted","Data":"482c6420bdf15628bb3a54d2a7f7d08c1440447bc1b1c72bd7ddfc06f09abe78"} Apr 24 21:38:26.915901 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:26.915746 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:38:26.938630 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:26.938544 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podStartSLOduration=6.938532496 podStartE2EDuration="6.938532496s" podCreationTimestamp="2026-04-24 21:38:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:38:26.936749158 +0000 UTC m=+659.360256566" watchObservedRunningTime="2026-04-24 21:38:26.938532496 +0000 UTC m=+659.362039902" Apr 24 21:38:27.919633 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:27.919599 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:38:27.919633 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:27.919635 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:38:27.920997 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:27.920959 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:5000: connect: connection refused" Apr 24 21:38:27.921662 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:27.921636 2570 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:38:28.921838 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:28.921799 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:5000: connect: connection refused" Apr 24 21:38:28.922312 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:28.922287 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:38:30.683239 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:30.683200 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.21:8643/healthz\": dial tcp 10.133.0.21:8643: connect: connection refused" Apr 24 21:38:30.683630 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:30.683326 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:38:33.926043 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:33.926000 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:38:33.926634 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:33.926604 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:5000: connect: connection refused" Apr 24 21:38:33.927017 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:33.926980 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:38:35.682528 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:35.682490 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.21:8643/healthz\": dial tcp 10.133.0.21:8643: connect: connection refused" Apr 24 21:38:36.685365 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:36.685314 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:38:36.685784 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:36.685611 2570 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:38:40.682960 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:40.682919 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.21:8643/healthz\": dial tcp 10.133.0.21:8643: connect: connection refused" Apr 24 21:38:43.926556 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:43.926517 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:5000: connect: connection refused" Apr 24 21:38:43.927017 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:43.926830 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:38:45.682581 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:45.682535 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.21:8643/healthz\": dial tcp 10.133.0.21:8643: connect: connection refused" Apr 24 21:38:46.685585 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:46.685533 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused" Apr 24 21:38:46.686016 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:46.685691 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:38:46.686016 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:46.685888 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:38:46.686016 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:46.685970 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:38:50.704386 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:50.704364 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:38:50.765736 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:50.765708 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"b5c6bcc7-6504-4840-8c68-a1f58727fd5a\" (UID: \"b5c6bcc7-6504-4840-8c68-a1f58727fd5a\") " Apr 24 21:38:50.765898 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:50.765748 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-proxy-tls\") pod \"b5c6bcc7-6504-4840-8c68-a1f58727fd5a\" (UID: \"b5c6bcc7-6504-4840-8c68-a1f58727fd5a\") " Apr 24 21:38:50.765898 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:50.765776 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt79n\" (UniqueName: \"kubernetes.io/projected/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-kube-api-access-nt79n\") pod \"b5c6bcc7-6504-4840-8c68-a1f58727fd5a\" (UID: \"b5c6bcc7-6504-4840-8c68-a1f58727fd5a\") " Apr 24 21:38:50.765898 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:50.765795 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-kserve-provision-location\") pod \"b5c6bcc7-6504-4840-8c68-a1f58727fd5a\" (UID: \"b5c6bcc7-6504-4840-8c68-a1f58727fd5a\") " Apr 24 21:38:50.766160 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:50.766132 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-isvc-sklearn-batcher-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-kube-rbac-proxy-sar-config") pod "b5c6bcc7-6504-4840-8c68-a1f58727fd5a" (UID: "b5c6bcc7-6504-4840-8c68-a1f58727fd5a"). InnerVolumeSpecName "isvc-sklearn-batcher-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:38:50.766160 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:50.766144 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b5c6bcc7-6504-4840-8c68-a1f58727fd5a" (UID: "b5c6bcc7-6504-4840-8c68-a1f58727fd5a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:38:50.767977 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:50.767947 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b5c6bcc7-6504-4840-8c68-a1f58727fd5a" (UID: "b5c6bcc7-6504-4840-8c68-a1f58727fd5a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:38:50.768104 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:50.767996 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-kube-api-access-nt79n" (OuterVolumeSpecName: "kube-api-access-nt79n") pod "b5c6bcc7-6504-4840-8c68-a1f58727fd5a" (UID: "b5c6bcc7-6504-4840-8c68-a1f58727fd5a"). 
InnerVolumeSpecName "kube-api-access-nt79n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:38:50.866466 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:50.866405 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:38:50.866466 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:50.866430 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:38:50.866466 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:50.866441 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:38:50.866466 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:50.866450 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nt79n\" (UniqueName: \"kubernetes.io/projected/b5c6bcc7-6504-4840-8c68-a1f58727fd5a-kube-api-access-nt79n\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:38:50.995649 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:50.995621 2570 generic.go:358] "Generic (PLEG): container finished" podID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerID="bac8a7755b33d6851ce5f30eadcdde192bf1a4d57b0e73a58b175252f56659f7" exitCode=0 Apr 24 21:38:50.995792 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:50.995669 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" event={"ID":"b5c6bcc7-6504-4840-8c68-a1f58727fd5a","Type":"ContainerDied","Data":"bac8a7755b33d6851ce5f30eadcdde192bf1a4d57b0e73a58b175252f56659f7"} Apr 24 21:38:50.995792 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:50.995691 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" event={"ID":"b5c6bcc7-6504-4840-8c68-a1f58727fd5a","Type":"ContainerDied","Data":"07e6a77787848c55eb70177f91637d2a35bfb430137b4a49e5a6edc97566246a"} Apr 24 21:38:50.995792 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:50.995705 2570 scope.go:117] "RemoveContainer" containerID="bac8a7755b33d6851ce5f30eadcdde192bf1a4d57b0e73a58b175252f56659f7" Apr 24 21:38:50.995792 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:50.995726 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" Apr 24 21:38:51.004050 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:51.004009 2570 scope.go:117] "RemoveContainer" containerID="6e81114b21326c42a618c1ca9f97fae221f1cf58fcf3ed40550626e230a1f258" Apr 24 21:38:51.010939 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:51.010922 2570 scope.go:117] "RemoveContainer" containerID="cafc55f3415ac8aee38d2e3edfbe13f419523a00cd41f20ab46bab77a6c80127" Apr 24 21:38:51.017074 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:51.017051 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w"] Apr 24 21:38:51.018850 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:51.018834 2570 scope.go:117] "RemoveContainer" containerID="386cb5eccc1a42b848bf80ec544220d20bd10c6de8559ac726406bdf6a389e53" Apr 24 21:38:51.021544 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:51.021524 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w"] Apr 24 21:38:51.026281 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:51.026242 2570 scope.go:117] "RemoveContainer" containerID="bac8a7755b33d6851ce5f30eadcdde192bf1a4d57b0e73a58b175252f56659f7" Apr 24 21:38:51.026526 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:38:51.026506 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bac8a7755b33d6851ce5f30eadcdde192bf1a4d57b0e73a58b175252f56659f7\": container with ID starting with bac8a7755b33d6851ce5f30eadcdde192bf1a4d57b0e73a58b175252f56659f7 not found: ID does not exist" containerID="bac8a7755b33d6851ce5f30eadcdde192bf1a4d57b0e73a58b175252f56659f7" Apr 24 21:38:51.026595 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:51.026534 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bac8a7755b33d6851ce5f30eadcdde192bf1a4d57b0e73a58b175252f56659f7"} err="failed to get container status \"bac8a7755b33d6851ce5f30eadcdde192bf1a4d57b0e73a58b175252f56659f7\": rpc error: code = NotFound desc = could not find container \"bac8a7755b33d6851ce5f30eadcdde192bf1a4d57b0e73a58b175252f56659f7\": container with ID starting with bac8a7755b33d6851ce5f30eadcdde192bf1a4d57b0e73a58b175252f56659f7 not found: ID does not exist" Apr 24 21:38:51.026595 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:51.026554 2570 scope.go:117] "RemoveContainer" containerID="6e81114b21326c42a618c1ca9f97fae221f1cf58fcf3ed40550626e230a1f258" Apr 24 21:38:51.026779 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:38:51.026759 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e81114b21326c42a618c1ca9f97fae221f1cf58fcf3ed40550626e230a1f258\": container with ID starting with 6e81114b21326c42a618c1ca9f97fae221f1cf58fcf3ed40550626e230a1f258 not found: ID does not exist" containerID="6e81114b21326c42a618c1ca9f97fae221f1cf58fcf3ed40550626e230a1f258" Apr 24 21:38:51.026840 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:51.026782 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e81114b21326c42a618c1ca9f97fae221f1cf58fcf3ed40550626e230a1f258"} err="failed to get container status \"6e81114b21326c42a618c1ca9f97fae221f1cf58fcf3ed40550626e230a1f258\": rpc error: code = NotFound desc = could not find container 
\"6e81114b21326c42a618c1ca9f97fae221f1cf58fcf3ed40550626e230a1f258\": container with ID starting with 6e81114b21326c42a618c1ca9f97fae221f1cf58fcf3ed40550626e230a1f258 not found: ID does not exist" Apr 24 21:38:51.026840 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:51.026797 2570 scope.go:117] "RemoveContainer" containerID="cafc55f3415ac8aee38d2e3edfbe13f419523a00cd41f20ab46bab77a6c80127" Apr 24 21:38:51.026999 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:38:51.026984 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cafc55f3415ac8aee38d2e3edfbe13f419523a00cd41f20ab46bab77a6c80127\": container with ID starting with cafc55f3415ac8aee38d2e3edfbe13f419523a00cd41f20ab46bab77a6c80127 not found: ID does not exist" containerID="cafc55f3415ac8aee38d2e3edfbe13f419523a00cd41f20ab46bab77a6c80127" Apr 24 21:38:51.027072 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:51.027000 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cafc55f3415ac8aee38d2e3edfbe13f419523a00cd41f20ab46bab77a6c80127"} err="failed to get container status \"cafc55f3415ac8aee38d2e3edfbe13f419523a00cd41f20ab46bab77a6c80127\": rpc error: code = NotFound desc = could not find container \"cafc55f3415ac8aee38d2e3edfbe13f419523a00cd41f20ab46bab77a6c80127\": container with ID starting with cafc55f3415ac8aee38d2e3edfbe13f419523a00cd41f20ab46bab77a6c80127 not found: ID does not exist" Apr 24 21:38:51.027264 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:51.027252 2570 scope.go:117] "RemoveContainer" containerID="386cb5eccc1a42b848bf80ec544220d20bd10c6de8559ac726406bdf6a389e53" Apr 24 21:38:51.027483 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:38:51.027467 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"386cb5eccc1a42b848bf80ec544220d20bd10c6de8559ac726406bdf6a389e53\": container with ID starting with 386cb5eccc1a42b848bf80ec544220d20bd10c6de8559ac726406bdf6a389e53 not found: ID does not exist" containerID="386cb5eccc1a42b848bf80ec544220d20bd10c6de8559ac726406bdf6a389e53" Apr 24 21:38:51.027529 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:51.027487 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"386cb5eccc1a42b848bf80ec544220d20bd10c6de8559ac726406bdf6a389e53"} err="failed to get container status \"386cb5eccc1a42b848bf80ec544220d20bd10c6de8559ac726406bdf6a389e53\": rpc error: code = NotFound desc = could not find container \"386cb5eccc1a42b848bf80ec544220d20bd10c6de8559ac726406bdf6a389e53\": container with ID starting with 386cb5eccc1a42b848bf80ec544220d20bd10c6de8559ac726406bdf6a389e53 not found: ID does not exist" Apr 24 21:38:51.682873 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:51.682831 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-67598ff486-n584w" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.21:8643/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 24 21:38:52.066629 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:52.066553 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" path="/var/lib/kubelet/pods/b5c6bcc7-6504-4840-8c68-a1f58727fd5a/volumes" Apr 24 21:38:53.927432 
ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:53.927396 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:5000: connect: connection refused" Apr 24 21:38:53.927888 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:38:53.927865 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:39:03.926705 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:03.926653 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:5000: connect: connection refused" Apr 24 21:39:03.927112 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:03.927061 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:39:13.927170 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:13.927070 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:5000: connect: connection refused" Apr 24 21:39:13.927595 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:13.927570 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:39:23.927088 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:23.927045 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:5000: connect: connection refused" Apr 24 21:39:23.927611 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:23.927570 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:39:33.927192 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:33.927160 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:39:33.927593 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:33.927399 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:39:45.754960 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:45.754927 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d"] Apr 24 21:39:45.755363 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:45.755269 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kserve-container" containerID="cri-o://482c6420bdf15628bb3a54d2a7f7d08c1440447bc1b1c72bd7ddfc06f09abe78" gracePeriod=30 Apr 24 21:39:45.755363 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:45.755300 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="agent" containerID="cri-o://645d41d5d0104e13ecd1fde5679f50f5afb4ee2e83d4f37659dfe3c620bd6c6a" gracePeriod=30 Apr 24 21:39:45.755473 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:45.755308 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kube-rbac-proxy" containerID="cri-o://72a19bfc9fb23b429f9b9d338f180722097af9d0449ebeb9fc6951c03d932375" gracePeriod=30 Apr 24 21:39:45.839783 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:45.839747 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf"] Apr 24 21:39:45.840065 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:45.840052 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="agent" Apr 24 21:39:45.840114 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:45.840068 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="agent" Apr 24 21:39:45.840114 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:45.840087 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="storage-initializer" Apr 24 21:39:45.840114 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:45.840092 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="storage-initializer" Apr 24 21:39:45.840114 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:45.840100 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kserve-container" Apr 24 21:39:45.840114 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:45.840105 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kserve-container" Apr 24 21:39:45.840114 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:45.840110 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kube-rbac-proxy" Apr 24 21:39:45.840114 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:45.840115 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kube-rbac-proxy" Apr 24 21:39:45.840334 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:45.840164 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="agent" Apr 24 21:39:45.840334 
ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:45.840173 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kube-rbac-proxy" Apr 24 21:39:45.840334 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:45.840180 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5c6bcc7-6504-4840-8c68-a1f58727fd5a" containerName="kserve-container" Apr 24 21:39:45.843207 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:45.843192 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" Apr 24 21:39:45.846581 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:45.846557 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-kube-rbac-proxy-sar-config\"" Apr 24 21:39:45.846695 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:45.846567 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-predictor-serving-cert\"" Apr 24 21:39:45.855539 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:45.855519 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf"] Apr 24 21:39:45.957840 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:45.957803 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7b391266-a5e9-4e4f-a458-285d471e2a5e-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-dtkmf\" (UID: \"7b391266-a5e9-4e4f-a458-285d471e2a5e\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" Apr 24 21:39:45.958019 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:45.957850 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b391266-a5e9-4e4f-a458-285d471e2a5e-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-dtkmf\" (UID: \"7b391266-a5e9-4e4f-a458-285d471e2a5e\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" Apr 24 21:39:45.958019 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:45.957877 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mfhs\" (UniqueName: \"kubernetes.io/projected/7b391266-a5e9-4e4f-a458-285d471e2a5e-kube-api-access-5mfhs\") pod \"message-dumper-predictor-c7d86bcbd-dtkmf\" (UID: \"7b391266-a5e9-4e4f-a458-285d471e2a5e\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" Apr 24 21:39:46.059239 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:46.059154 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7b391266-a5e9-4e4f-a458-285d471e2a5e-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-dtkmf\" (UID: \"7b391266-a5e9-4e4f-a458-285d471e2a5e\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" Apr 24 21:39:46.059239 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:46.059208 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b391266-a5e9-4e4f-a458-285d471e2a5e-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-dtkmf\" 
(UID: \"7b391266-a5e9-4e4f-a458-285d471e2a5e\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" Apr 24 21:39:46.059239 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:46.059237 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5mfhs\" (UniqueName: \"kubernetes.io/projected/7b391266-a5e9-4e4f-a458-285d471e2a5e-kube-api-access-5mfhs\") pod \"message-dumper-predictor-c7d86bcbd-dtkmf\" (UID: \"7b391266-a5e9-4e4f-a458-285d471e2a5e\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" Apr 24 21:39:46.059923 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:46.059896 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7b391266-a5e9-4e4f-a458-285d471e2a5e-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-dtkmf\" (UID: \"7b391266-a5e9-4e4f-a458-285d471e2a5e\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" Apr 24 21:39:46.061796 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:46.061771 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b391266-a5e9-4e4f-a458-285d471e2a5e-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-dtkmf\" (UID: \"7b391266-a5e9-4e4f-a458-285d471e2a5e\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" Apr 24 21:39:46.070006 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:46.069977 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mfhs\" (UniqueName: \"kubernetes.io/projected/7b391266-a5e9-4e4f-a458-285d471e2a5e-kube-api-access-5mfhs\") pod \"message-dumper-predictor-c7d86bcbd-dtkmf\" (UID: \"7b391266-a5e9-4e4f-a458-285d471e2a5e\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" Apr 24 21:39:46.152191 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:46.152159 2570 generic.go:358] "Generic (PLEG): container finished" podID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerID="72a19bfc9fb23b429f9b9d338f180722097af9d0449ebeb9fc6951c03d932375" exitCode=2 Apr 24 21:39:46.152362 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:46.152232 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" event={"ID":"44509c85-d6bd-4967-8f7c-7f9cc7096bfb","Type":"ContainerDied","Data":"72a19bfc9fb23b429f9b9d338f180722097af9d0449ebeb9fc6951c03d932375"} Apr 24 21:39:46.152362 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:46.152322 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" Apr 24 21:39:46.298459 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:46.298428 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf"] Apr 24 21:39:46.302846 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:39:46.302820 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b391266_a5e9_4e4f_a458_285d471e2a5e.slice/crio-8b15993d3528e681fb1bde9fd7fb37cf3ace7ac6ce1f715eaf4b5ccd8b8cda3d WatchSource:0}: Error finding container 8b15993d3528e681fb1bde9fd7fb37cf3ace7ac6ce1f715eaf4b5ccd8b8cda3d: Status 404 returned error can't find the container with id 8b15993d3528e681fb1bde9fd7fb37cf3ace7ac6ce1f715eaf4b5ccd8b8cda3d Apr 24 21:39:46.304621 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:46.304603 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:39:47.157603 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:47.157556 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" event={"ID":"7b391266-a5e9-4e4f-a458-285d471e2a5e","Type":"ContainerStarted","Data":"8b15993d3528e681fb1bde9fd7fb37cf3ace7ac6ce1f715eaf4b5ccd8b8cda3d"} Apr 24 21:39:48.162407 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:48.162374 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" event={"ID":"7b391266-a5e9-4e4f-a458-285d471e2a5e","Type":"ContainerStarted","Data":"d4f1587512f0bfc53b4cf53e3259915c13497e66907338982d809c1fbcfc3497"} Apr 24 21:39:48.162407 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:48.162410 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" event={"ID":"7b391266-a5e9-4e4f-a458-285d471e2a5e","Type":"ContainerStarted","Data":"001242d0ae53dfb4246d3def8d28b36b1ada8f266df74019a2fd5e262089ea81"} Apr 24 21:39:48.162800 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:48.162512 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" Apr 24 21:39:48.187994 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:48.187951 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" podStartSLOduration=2.221737763 podStartE2EDuration="3.187936936s" podCreationTimestamp="2026-04-24 21:39:45 +0000 UTC" firstStartedPulling="2026-04-24 21:39:46.304756781 +0000 UTC m=+738.728264166" lastFinishedPulling="2026-04-24 21:39:47.270955954 +0000 UTC m=+739.694463339" observedRunningTime="2026-04-24 21:39:48.187221405 +0000 UTC m=+740.610728822" watchObservedRunningTime="2026-04-24 21:39:48.187936936 +0000 UTC m=+740.611444342" Apr 24 21:39:48.922919 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:48.922870 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.22:8643/healthz\": dial tcp 10.133.0.22:8643: connect: connection refused" Apr 24 21:39:49.165181 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:49.165147 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" Apr 24 21:39:49.166802 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:49.166779 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" Apr 24 21:39:50.169971 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:50.169941 2570 generic.go:358] "Generic (PLEG): container finished" podID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerID="482c6420bdf15628bb3a54d2a7f7d08c1440447bc1b1c72bd7ddfc06f09abe78" exitCode=0 Apr 24 21:39:50.170386 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:50.170043 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" event={"ID":"44509c85-d6bd-4967-8f7c-7f9cc7096bfb","Type":"ContainerDied","Data":"482c6420bdf15628bb3a54d2a7f7d08c1440447bc1b1c72bd7ddfc06f09abe78"} Apr 24 21:39:53.922608 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:53.922559 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.22:8643/healthz\": dial tcp 10.133.0.22:8643: connect: connection refused" Apr 24 21:39:53.926794 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:53.926769 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:5000: connect: connection refused" Apr 24 21:39:53.927149 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:53.927120 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:39:56.177059 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:56.177011 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" Apr 24 21:39:58.922533 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:58.922495 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.22:8643/healthz\": dial tcp 10.133.0.22:8643: connect: connection refused" Apr 24 21:39:58.922983 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:39:58.922612 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:40:03.922588 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:03.922541 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.22:8643/healthz\": dial tcp 10.133.0.22:8643: connect: connection refused" Apr 24 21:40:03.926878 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:03.926855 2570 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:5000: connect: connection refused" Apr 24 21:40:03.927181 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:03.927157 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:40:05.862562 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:05.862536 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt"] Apr 24 21:40:05.865402 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:05.865382 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:40:05.867695 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:05.867673 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-predictor-serving-cert\"" Apr 24 21:40:05.867967 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:05.867944 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-kube-rbac-proxy-sar-config\"" Apr 24 21:40:05.877331 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:05.877309 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt"] Apr 24 21:40:05.898142 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:05.898123 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07a7d93a-5786-46f3-9a15-c62c97817299-proxy-tls\") pod \"isvc-logger-predictor-b9bbfbbdf-w7gmt\" (UID: \"07a7d93a-5786-46f3-9a15-c62c97817299\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:40:05.898227 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:05.898163 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07a7d93a-5786-46f3-9a15-c62c97817299-kserve-provision-location\") pod \"isvc-logger-predictor-b9bbfbbdf-w7gmt\" (UID: \"07a7d93a-5786-46f3-9a15-c62c97817299\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:40:05.898227 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:05.898185 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7cvw\" (UniqueName: \"kubernetes.io/projected/07a7d93a-5786-46f3-9a15-c62c97817299-kube-api-access-p7cvw\") pod \"isvc-logger-predictor-b9bbfbbdf-w7gmt\" (UID: \"07a7d93a-5786-46f3-9a15-c62c97817299\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:40:05.898314 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:05.898296 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/07a7d93a-5786-46f3-9a15-c62c97817299-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-b9bbfbbdf-w7gmt\" (UID: \"07a7d93a-5786-46f3-9a15-c62c97817299\") " 
pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:40:05.999202 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:05.999179 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07a7d93a-5786-46f3-9a15-c62c97817299-proxy-tls\") pod \"isvc-logger-predictor-b9bbfbbdf-w7gmt\" (UID: \"07a7d93a-5786-46f3-9a15-c62c97817299\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:40:05.999340 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:05.999215 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07a7d93a-5786-46f3-9a15-c62c97817299-kserve-provision-location\") pod \"isvc-logger-predictor-b9bbfbbdf-w7gmt\" (UID: \"07a7d93a-5786-46f3-9a15-c62c97817299\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:40:05.999340 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:05.999245 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7cvw\" (UniqueName: \"kubernetes.io/projected/07a7d93a-5786-46f3-9a15-c62c97817299-kube-api-access-p7cvw\") pod \"isvc-logger-predictor-b9bbfbbdf-w7gmt\" (UID: \"07a7d93a-5786-46f3-9a15-c62c97817299\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:40:05.999340 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:05.999300 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/07a7d93a-5786-46f3-9a15-c62c97817299-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-b9bbfbbdf-w7gmt\" (UID: \"07a7d93a-5786-46f3-9a15-c62c97817299\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:40:05.999647 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:05.999626 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07a7d93a-5786-46f3-9a15-c62c97817299-kserve-provision-location\") pod \"isvc-logger-predictor-b9bbfbbdf-w7gmt\" (UID: \"07a7d93a-5786-46f3-9a15-c62c97817299\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:40:05.999938 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:05.999921 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/07a7d93a-5786-46f3-9a15-c62c97817299-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-b9bbfbbdf-w7gmt\" (UID: \"07a7d93a-5786-46f3-9a15-c62c97817299\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:40:06.001795 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:06.001775 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07a7d93a-5786-46f3-9a15-c62c97817299-proxy-tls\") pod \"isvc-logger-predictor-b9bbfbbdf-w7gmt\" (UID: \"07a7d93a-5786-46f3-9a15-c62c97817299\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:40:06.007798 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:06.007781 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7cvw\" (UniqueName: \"kubernetes.io/projected/07a7d93a-5786-46f3-9a15-c62c97817299-kube-api-access-p7cvw\") pod \"isvc-logger-predictor-b9bbfbbdf-w7gmt\" (UID: 
\"07a7d93a-5786-46f3-9a15-c62c97817299\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:40:06.175698 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:06.175673 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:40:06.298827 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:06.298795 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt"] Apr 24 21:40:06.302037 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:40:06.301982 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07a7d93a_5786_46f3_9a15_c62c97817299.slice/crio-4e9cf6777d955b0eda2df739550ccc712f71bd4b3a590d382b62e3a1f971343a WatchSource:0}: Error finding container 4e9cf6777d955b0eda2df739550ccc712f71bd4b3a590d382b62e3a1f971343a: Status 404 returned error can't find the container with id 4e9cf6777d955b0eda2df739550ccc712f71bd4b3a590d382b62e3a1f971343a Apr 24 21:40:07.215675 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:07.215640 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" event={"ID":"07a7d93a-5786-46f3-9a15-c62c97817299","Type":"ContainerStarted","Data":"a405d786a1d96ae70ca7ca2a33e9147327e9933cec2e6f6af71f64957e9c4e2f"} Apr 24 21:40:07.215675 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:07.215680 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" event={"ID":"07a7d93a-5786-46f3-9a15-c62c97817299","Type":"ContainerStarted","Data":"4e9cf6777d955b0eda2df739550ccc712f71bd4b3a590d382b62e3a1f971343a"} Apr 24 21:40:08.922646 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:08.922609 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.22:8643/healthz\": dial tcp 10.133.0.22:8643: connect: connection refused" Apr 24 21:40:10.225178 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:10.225145 2570 generic.go:358] "Generic (PLEG): container finished" podID="07a7d93a-5786-46f3-9a15-c62c97817299" containerID="a405d786a1d96ae70ca7ca2a33e9147327e9933cec2e6f6af71f64957e9c4e2f" exitCode=0 Apr 24 21:40:10.225576 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:10.225219 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" event={"ID":"07a7d93a-5786-46f3-9a15-c62c97817299","Type":"ContainerDied","Data":"a405d786a1d96ae70ca7ca2a33e9147327e9933cec2e6f6af71f64957e9c4e2f"} Apr 24 21:40:11.229923 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:11.229891 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" event={"ID":"07a7d93a-5786-46f3-9a15-c62c97817299","Type":"ContainerStarted","Data":"2b4d1c7769f5fec7812bcc6b149500f9eea72340068985b255cc57a4c1b77fb9"} Apr 24 21:40:11.229923 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:11.229931 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" event={"ID":"07a7d93a-5786-46f3-9a15-c62c97817299","Type":"ContainerStarted","Data":"0a4e12ca1ad51807ed498c021a746ffcdf11d9042a48c0186e978d7c8c7e0a5c"} Apr 24 
21:40:11.230409 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:11.229941 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" event={"ID":"07a7d93a-5786-46f3-9a15-c62c97817299","Type":"ContainerStarted","Data":"946c2825aa24977ee278c0b54cf7aec87f78e1a8d6e0314b64f285c51dadf327"} Apr 24 21:40:11.230468 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:11.230444 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:40:11.230519 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:11.230475 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:40:11.230519 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:11.230489 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:40:11.231630 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:11.231595 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:40:11.232333 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:11.232311 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:40:11.255541 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:11.255491 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podStartSLOduration=6.255476858 podStartE2EDuration="6.255476858s" podCreationTimestamp="2026-04-24 21:40:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:40:11.255141493 +0000 UTC m=+763.678648897" watchObservedRunningTime="2026-04-24 21:40:11.255476858 +0000 UTC m=+763.678984251" Apr 24 21:40:12.233073 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:12.233010 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:40:12.233510 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:12.233471 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:40:13.238434 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:13.238389 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:40:13.238825 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:13.238778 2570 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:40:13.922552 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:13.922506 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.22:8643/healthz\": dial tcp 10.133.0.22:8643: connect: connection refused" Apr 24 21:40:13.926789 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:13.926758 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.22:5000: connect: connection refused" Apr 24 21:40:13.926952 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:13.926936 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:40:13.927154 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:13.927134 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:40:13.927245 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:13.927232 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:40:15.897251 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:15.897224 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:40:15.969349 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:15.969322 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"44509c85-d6bd-4967-8f7c-7f9cc7096bfb\" (UID: \"44509c85-d6bd-4967-8f7c-7f9cc7096bfb\") " Apr 24 21:40:15.969471 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:15.969373 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmxhl\" (UniqueName: \"kubernetes.io/projected/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-kube-api-access-gmxhl\") pod \"44509c85-d6bd-4967-8f7c-7f9cc7096bfb\" (UID: \"44509c85-d6bd-4967-8f7c-7f9cc7096bfb\") " Apr 24 21:40:15.969471 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:15.969402 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-proxy-tls\") pod \"44509c85-d6bd-4967-8f7c-7f9cc7096bfb\" (UID: \"44509c85-d6bd-4967-8f7c-7f9cc7096bfb\") " Apr 24 21:40:15.969471 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:15.969426 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-kserve-provision-location\") pod \"44509c85-d6bd-4967-8f7c-7f9cc7096bfb\" (UID: \"44509c85-d6bd-4967-8f7c-7f9cc7096bfb\") " Apr 24 21:40:15.969639 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:15.969557 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config") pod "44509c85-d6bd-4967-8f7c-7f9cc7096bfb" (UID: "44509c85-d6bd-4967-8f7c-7f9cc7096bfb"). InnerVolumeSpecName "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:40:15.969639 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:15.969638 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:40:15.969816 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:15.969796 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "44509c85-d6bd-4967-8f7c-7f9cc7096bfb" (UID: "44509c85-d6bd-4967-8f7c-7f9cc7096bfb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:40:15.971703 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:15.971676 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "44509c85-d6bd-4967-8f7c-7f9cc7096bfb" (UID: "44509c85-d6bd-4967-8f7c-7f9cc7096bfb"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:40:15.971703 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:15.971678 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-kube-api-access-gmxhl" (OuterVolumeSpecName: "kube-api-access-gmxhl") pod "44509c85-d6bd-4967-8f7c-7f9cc7096bfb" (UID: "44509c85-d6bd-4967-8f7c-7f9cc7096bfb"). InnerVolumeSpecName "kube-api-access-gmxhl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:40:16.070042 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:16.069964 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gmxhl\" (UniqueName: \"kubernetes.io/projected/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-kube-api-access-gmxhl\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:40:16.070042 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:16.069986 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:40:16.070042 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:16.069995 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/44509c85-d6bd-4967-8f7c-7f9cc7096bfb-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:40:16.248140 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:16.248108 2570 generic.go:358] "Generic (PLEG): container finished" podID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerID="645d41d5d0104e13ecd1fde5679f50f5afb4ee2e83d4f37659dfe3c620bd6c6a" exitCode=0 Apr 24 21:40:16.248301 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:16.248192 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" Apr 24 21:40:16.248301 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:16.248195 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" event={"ID":"44509c85-d6bd-4967-8f7c-7f9cc7096bfb","Type":"ContainerDied","Data":"645d41d5d0104e13ecd1fde5679f50f5afb4ee2e83d4f37659dfe3c620bd6c6a"} Apr 24 21:40:16.248301 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:16.248234 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d" event={"ID":"44509c85-d6bd-4967-8f7c-7f9cc7096bfb","Type":"ContainerDied","Data":"f6bd2d48bd01109b5b9538aeee5140fa06287a9c690ac8438a292a18be57ddc7"} Apr 24 21:40:16.248301 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:16.248251 2570 scope.go:117] "RemoveContainer" containerID="645d41d5d0104e13ecd1fde5679f50f5afb4ee2e83d4f37659dfe3c620bd6c6a" Apr 24 21:40:16.255554 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:16.255528 2570 scope.go:117] "RemoveContainer" containerID="72a19bfc9fb23b429f9b9d338f180722097af9d0449ebeb9fc6951c03d932375" Apr 24 21:40:16.262171 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:16.262155 2570 scope.go:117] "RemoveContainer" containerID="482c6420bdf15628bb3a54d2a7f7d08c1440447bc1b1c72bd7ddfc06f09abe78" Apr 24 21:40:16.264965 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:16.264946 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d"] Apr 24 21:40:16.269192 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:16.269162 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-55dff46fc9-22l7d"] Apr 24 21:40:16.270049 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:16.270014 2570 scope.go:117] "RemoveContainer" containerID="c8ca820c231b51fe54bd49d791c4a7dfdf894c8bee3387b1a07c6f0f44dc5a2b" Apr 24 21:40:16.276608 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:16.276591 2570 scope.go:117] "RemoveContainer" containerID="645d41d5d0104e13ecd1fde5679f50f5afb4ee2e83d4f37659dfe3c620bd6c6a" Apr 24 21:40:16.276849 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:40:16.276831 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"645d41d5d0104e13ecd1fde5679f50f5afb4ee2e83d4f37659dfe3c620bd6c6a\": container with ID starting with 645d41d5d0104e13ecd1fde5679f50f5afb4ee2e83d4f37659dfe3c620bd6c6a not found: ID does not exist" containerID="645d41d5d0104e13ecd1fde5679f50f5afb4ee2e83d4f37659dfe3c620bd6c6a" Apr 24 21:40:16.276915 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:16.276860 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"645d41d5d0104e13ecd1fde5679f50f5afb4ee2e83d4f37659dfe3c620bd6c6a"} err="failed to get container status \"645d41d5d0104e13ecd1fde5679f50f5afb4ee2e83d4f37659dfe3c620bd6c6a\": rpc error: code = NotFound desc = could not find container \"645d41d5d0104e13ecd1fde5679f50f5afb4ee2e83d4f37659dfe3c620bd6c6a\": container with ID starting with 645d41d5d0104e13ecd1fde5679f50f5afb4ee2e83d4f37659dfe3c620bd6c6a not found: ID does not exist" Apr 24 21:40:16.276915 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:16.276884 2570 scope.go:117] "RemoveContainer" 
containerID="72a19bfc9fb23b429f9b9d338f180722097af9d0449ebeb9fc6951c03d932375" Apr 24 21:40:16.277131 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:40:16.277115 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72a19bfc9fb23b429f9b9d338f180722097af9d0449ebeb9fc6951c03d932375\": container with ID starting with 72a19bfc9fb23b429f9b9d338f180722097af9d0449ebeb9fc6951c03d932375 not found: ID does not exist" containerID="72a19bfc9fb23b429f9b9d338f180722097af9d0449ebeb9fc6951c03d932375" Apr 24 21:40:16.277177 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:16.277138 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72a19bfc9fb23b429f9b9d338f180722097af9d0449ebeb9fc6951c03d932375"} err="failed to get container status \"72a19bfc9fb23b429f9b9d338f180722097af9d0449ebeb9fc6951c03d932375\": rpc error: code = NotFound desc = could not find container \"72a19bfc9fb23b429f9b9d338f180722097af9d0449ebeb9fc6951c03d932375\": container with ID starting with 72a19bfc9fb23b429f9b9d338f180722097af9d0449ebeb9fc6951c03d932375 not found: ID does not exist" Apr 24 21:40:16.277177 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:16.277155 2570 scope.go:117] "RemoveContainer" containerID="482c6420bdf15628bb3a54d2a7f7d08c1440447bc1b1c72bd7ddfc06f09abe78" Apr 24 21:40:16.277398 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:40:16.277382 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"482c6420bdf15628bb3a54d2a7f7d08c1440447bc1b1c72bd7ddfc06f09abe78\": container with ID starting with 482c6420bdf15628bb3a54d2a7f7d08c1440447bc1b1c72bd7ddfc06f09abe78 not found: ID does not exist" containerID="482c6420bdf15628bb3a54d2a7f7d08c1440447bc1b1c72bd7ddfc06f09abe78" Apr 24 21:40:16.277457 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:16.277405 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"482c6420bdf15628bb3a54d2a7f7d08c1440447bc1b1c72bd7ddfc06f09abe78"} err="failed to get container status \"482c6420bdf15628bb3a54d2a7f7d08c1440447bc1b1c72bd7ddfc06f09abe78\": rpc error: code = NotFound desc = could not find container \"482c6420bdf15628bb3a54d2a7f7d08c1440447bc1b1c72bd7ddfc06f09abe78\": container with ID starting with 482c6420bdf15628bb3a54d2a7f7d08c1440447bc1b1c72bd7ddfc06f09abe78 not found: ID does not exist" Apr 24 21:40:16.277457 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:16.277425 2570 scope.go:117] "RemoveContainer" containerID="c8ca820c231b51fe54bd49d791c4a7dfdf894c8bee3387b1a07c6f0f44dc5a2b" Apr 24 21:40:16.277657 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:40:16.277642 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8ca820c231b51fe54bd49d791c4a7dfdf894c8bee3387b1a07c6f0f44dc5a2b\": container with ID starting with c8ca820c231b51fe54bd49d791c4a7dfdf894c8bee3387b1a07c6f0f44dc5a2b not found: ID does not exist" containerID="c8ca820c231b51fe54bd49d791c4a7dfdf894c8bee3387b1a07c6f0f44dc5a2b" Apr 24 21:40:16.277697 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:16.277663 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8ca820c231b51fe54bd49d791c4a7dfdf894c8bee3387b1a07c6f0f44dc5a2b"} err="failed to get container status \"c8ca820c231b51fe54bd49d791c4a7dfdf894c8bee3387b1a07c6f0f44dc5a2b\": rpc error: code = NotFound desc = could not 
find container \"c8ca820c231b51fe54bd49d791c4a7dfdf894c8bee3387b1a07c6f0f44dc5a2b\": container with ID starting with c8ca820c231b51fe54bd49d791c4a7dfdf894c8bee3387b1a07c6f0f44dc5a2b not found: ID does not exist" Apr 24 21:40:18.067966 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:18.067934 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" path="/var/lib/kubelet/pods/44509c85-d6bd-4967-8f7c-7f9cc7096bfb/volumes" Apr 24 21:40:18.242103 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:18.242074 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:40:18.242597 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:18.242569 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:40:18.242899 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:18.242873 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:40:28.242969 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:28.242923 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:40:28.243512 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:28.243487 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:40:38.243403 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:38.243366 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:40:38.243955 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:38.243787 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:40:48.242928 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:48.242840 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:40:48.243444 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:48.243355 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:40:58.243296 
ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:58.243253 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:40:58.243754 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:40:58.243645 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:41:08.242632 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:08.242586 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:41:08.243179 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:08.243067 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:41:18.243156 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:18.243124 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:41:18.243578 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:18.243184 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:41:30.875747 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:30.875707 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-c7d86bcbd-dtkmf_7b391266-a5e9-4e4f-a458-285d471e2a5e/kserve-container/0.log" Apr 24 21:41:31.089599 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.089562 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt"] Apr 24 21:41:31.089995 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.089941 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kserve-container" containerID="cri-o://946c2825aa24977ee278c0b54cf7aec87f78e1a8d6e0314b64f285c51dadf327" gracePeriod=30 Apr 24 21:41:31.090220 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.089953 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="agent" containerID="cri-o://2b4d1c7769f5fec7812bcc6b149500f9eea72340068985b255cc57a4c1b77fb9" gracePeriod=30 Apr 24 21:41:31.090220 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.090058 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kube-rbac-proxy" containerID="cri-o://0a4e12ca1ad51807ed498c021a746ffcdf11d9042a48c0186e978d7c8c7e0a5c" gracePeriod=30 Apr 24 21:41:31.147136 ip-10-0-129-230 kubenswrapper[2570]: I0424 
21:41:31.147071 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc"] Apr 24 21:41:31.147388 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.147376 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kube-rbac-proxy" Apr 24 21:41:31.147428 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.147391 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kube-rbac-proxy" Apr 24 21:41:31.147428 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.147401 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kserve-container" Apr 24 21:41:31.147428 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.147408 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kserve-container" Apr 24 21:41:31.147428 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.147422 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="storage-initializer" Apr 24 21:41:31.147545 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.147432 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="storage-initializer" Apr 24 21:41:31.147545 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.147441 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="agent" Apr 24 21:41:31.147545 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.147446 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="agent" Apr 24 21:41:31.147545 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.147491 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kube-rbac-proxy" Apr 24 21:41:31.147545 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.147502 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="agent" Apr 24 21:41:31.147545 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.147514 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="44509c85-d6bd-4967-8f7c-7f9cc7096bfb" containerName="kserve-container" Apr 24 21:41:31.150527 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.150509 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" Apr 24 21:41:31.154399 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.154342 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-kube-rbac-proxy-sar-config\"" Apr 24 21:41:31.155208 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.155189 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-predictor-serving-cert\"" Apr 24 21:41:31.163060 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.163016 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc"] Apr 24 21:41:31.221475 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.221447 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26aab879-6046-488f-b60f-e79c92f6b863-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-zdrwc\" (UID: \"26aab879-6046-488f-b60f-e79c92f6b863\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" Apr 24 21:41:31.221600 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.221488 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26aab879-6046-488f-b60f-e79c92f6b863-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-zdrwc\" (UID: \"26aab879-6046-488f-b60f-e79c92f6b863\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" Apr 24 21:41:31.221600 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.221564 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26aab879-6046-488f-b60f-e79c92f6b863-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-zdrwc\" (UID: \"26aab879-6046-488f-b60f-e79c92f6b863\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" Apr 24 21:41:31.221600 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.221593 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvn5s\" (UniqueName: \"kubernetes.io/projected/26aab879-6046-488f-b60f-e79c92f6b863-kube-api-access-gvn5s\") pod \"isvc-lightgbm-predictor-bdf964bd-zdrwc\" (UID: \"26aab879-6046-488f-b60f-e79c92f6b863\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" Apr 24 21:41:31.247029 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.247003 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf"] Apr 24 21:41:31.247313 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.247276 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" podUID="7b391266-a5e9-4e4f-a458-285d471e2a5e" containerName="kserve-container" containerID="cri-o://001242d0ae53dfb4246d3def8d28b36b1ada8f266df74019a2fd5e262089ea81" gracePeriod=30 Apr 24 21:41:31.247364 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.247316 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" podUID="7b391266-a5e9-4e4f-a458-285d471e2a5e" 
containerName="kube-rbac-proxy" containerID="cri-o://d4f1587512f0bfc53b4cf53e3259915c13497e66907338982d809c1fbcfc3497" gracePeriod=30 Apr 24 21:41:31.322538 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.322502 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26aab879-6046-488f-b60f-e79c92f6b863-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-zdrwc\" (UID: \"26aab879-6046-488f-b60f-e79c92f6b863\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" Apr 24 21:41:31.322708 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.322551 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvn5s\" (UniqueName: \"kubernetes.io/projected/26aab879-6046-488f-b60f-e79c92f6b863-kube-api-access-gvn5s\") pod \"isvc-lightgbm-predictor-bdf964bd-zdrwc\" (UID: \"26aab879-6046-488f-b60f-e79c92f6b863\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" Apr 24 21:41:31.322708 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.322630 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26aab879-6046-488f-b60f-e79c92f6b863-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-zdrwc\" (UID: \"26aab879-6046-488f-b60f-e79c92f6b863\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" Apr 24 21:41:31.322708 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.322675 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26aab879-6046-488f-b60f-e79c92f6b863-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-zdrwc\" (UID: \"26aab879-6046-488f-b60f-e79c92f6b863\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" Apr 24 21:41:31.323115 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.323087 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26aab879-6046-488f-b60f-e79c92f6b863-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-zdrwc\" (UID: \"26aab879-6046-488f-b60f-e79c92f6b863\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" Apr 24 21:41:31.323364 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.323342 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26aab879-6046-488f-b60f-e79c92f6b863-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-zdrwc\" (UID: \"26aab879-6046-488f-b60f-e79c92f6b863\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" Apr 24 21:41:31.325384 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.325359 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26aab879-6046-488f-b60f-e79c92f6b863-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-zdrwc\" (UID: \"26aab879-6046-488f-b60f-e79c92f6b863\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" Apr 24 21:41:31.332264 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.332240 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvn5s\" (UniqueName: 
\"kubernetes.io/projected/26aab879-6046-488f-b60f-e79c92f6b863-kube-api-access-gvn5s\") pod \"isvc-lightgbm-predictor-bdf964bd-zdrwc\" (UID: \"26aab879-6046-488f-b60f-e79c92f6b863\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" Apr 24 21:41:31.459911 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.459887 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" Apr 24 21:41:31.460099 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.460005 2570 generic.go:358] "Generic (PLEG): container finished" podID="07a7d93a-5786-46f3-9a15-c62c97817299" containerID="0a4e12ca1ad51807ed498c021a746ffcdf11d9042a48c0186e978d7c8c7e0a5c" exitCode=2 Apr 24 21:41:31.460099 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.460058 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" event={"ID":"07a7d93a-5786-46f3-9a15-c62c97817299","Type":"ContainerDied","Data":"0a4e12ca1ad51807ed498c021a746ffcdf11d9042a48c0186e978d7c8c7e0a5c"} Apr 24 21:41:31.461685 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.461660 2570 generic.go:358] "Generic (PLEG): container finished" podID="7b391266-a5e9-4e4f-a458-285d471e2a5e" containerID="d4f1587512f0bfc53b4cf53e3259915c13497e66907338982d809c1fbcfc3497" exitCode=2 Apr 24 21:41:31.461789 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.461685 2570 generic.go:358] "Generic (PLEG): container finished" podID="7b391266-a5e9-4e4f-a458-285d471e2a5e" containerID="001242d0ae53dfb4246d3def8d28b36b1ada8f266df74019a2fd5e262089ea81" exitCode=2 Apr 24 21:41:31.461789 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.461726 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" event={"ID":"7b391266-a5e9-4e4f-a458-285d471e2a5e","Type":"ContainerDied","Data":"d4f1587512f0bfc53b4cf53e3259915c13497e66907338982d809c1fbcfc3497"} Apr 24 21:41:31.461789 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.461747 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" event={"ID":"7b391266-a5e9-4e4f-a458-285d471e2a5e","Type":"ContainerDied","Data":"001242d0ae53dfb4246d3def8d28b36b1ada8f266df74019a2fd5e262089ea81"} Apr 24 21:41:31.482366 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.482348 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" Apr 24 21:41:31.524535 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.524488 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b391266-a5e9-4e4f-a458-285d471e2a5e-proxy-tls\") pod \"7b391266-a5e9-4e4f-a458-285d471e2a5e\" (UID: \"7b391266-a5e9-4e4f-a458-285d471e2a5e\") " Apr 24 21:41:31.524535 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.524530 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7b391266-a5e9-4e4f-a458-285d471e2a5e-message-dumper-kube-rbac-proxy-sar-config\") pod \"7b391266-a5e9-4e4f-a458-285d471e2a5e\" (UID: \"7b391266-a5e9-4e4f-a458-285d471e2a5e\") " Apr 24 21:41:31.524876 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.524586 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mfhs\" (UniqueName: \"kubernetes.io/projected/7b391266-a5e9-4e4f-a458-285d471e2a5e-kube-api-access-5mfhs\") pod \"7b391266-a5e9-4e4f-a458-285d471e2a5e\" (UID: \"7b391266-a5e9-4e4f-a458-285d471e2a5e\") " Apr 24 21:41:31.525702 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.525666 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b391266-a5e9-4e4f-a458-285d471e2a5e-message-dumper-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-kube-rbac-proxy-sar-config") pod "7b391266-a5e9-4e4f-a458-285d471e2a5e" (UID: "7b391266-a5e9-4e4f-a458-285d471e2a5e"). InnerVolumeSpecName "message-dumper-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:41:31.527010 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.526980 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b391266-a5e9-4e4f-a458-285d471e2a5e-kube-api-access-5mfhs" (OuterVolumeSpecName: "kube-api-access-5mfhs") pod "7b391266-a5e9-4e4f-a458-285d471e2a5e" (UID: "7b391266-a5e9-4e4f-a458-285d471e2a5e"). InnerVolumeSpecName "kube-api-access-5mfhs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:41:31.527526 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.527498 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b391266-a5e9-4e4f-a458-285d471e2a5e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7b391266-a5e9-4e4f-a458-285d471e2a5e" (UID: "7b391266-a5e9-4e4f-a458-285d471e2a5e"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:41:31.586406 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.586373 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc"] Apr 24 21:41:31.592256 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:41:31.592222 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26aab879_6046_488f_b60f_e79c92f6b863.slice/crio-170cf14d494179ba6eb16aeb797d75c3ca736fc6cd3b8228df0103100f8a3d7e WatchSource:0}: Error finding container 170cf14d494179ba6eb16aeb797d75c3ca736fc6cd3b8228df0103100f8a3d7e: Status 404 returned error can't find the container with id 170cf14d494179ba6eb16aeb797d75c3ca736fc6cd3b8228df0103100f8a3d7e Apr 24 21:41:31.625679 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.625640 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5mfhs\" (UniqueName: \"kubernetes.io/projected/7b391266-a5e9-4e4f-a458-285d471e2a5e-kube-api-access-5mfhs\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:41:31.625679 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.625668 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b391266-a5e9-4e4f-a458-285d471e2a5e-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:41:31.625834 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:31.625684 2570 reconciler_common.go:299] "Volume detached for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7b391266-a5e9-4e4f-a458-285d471e2a5e-message-dumper-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:41:32.466037 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:32.465996 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" event={"ID":"26aab879-6046-488f-b60f-e79c92f6b863","Type":"ContainerStarted","Data":"15028261c2e1cdceb0a4d78a09f1e657fb9e6487e77dfbf22b340f5879c0872d"} Apr 24 21:41:32.466485 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:32.466048 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" event={"ID":"26aab879-6046-488f-b60f-e79c92f6b863","Type":"ContainerStarted","Data":"170cf14d494179ba6eb16aeb797d75c3ca736fc6cd3b8228df0103100f8a3d7e"} Apr 24 21:41:32.467409 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:32.467388 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" event={"ID":"7b391266-a5e9-4e4f-a458-285d471e2a5e","Type":"ContainerDied","Data":"8b15993d3528e681fb1bde9fd7fb37cf3ace7ac6ce1f715eaf4b5ccd8b8cda3d"} Apr 24 21:41:32.467489 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:32.467425 2570 scope.go:117] "RemoveContainer" containerID="d4f1587512f0bfc53b4cf53e3259915c13497e66907338982d809c1fbcfc3497" Apr 24 21:41:32.467489 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:32.467436 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf" Apr 24 21:41:32.475191 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:32.475175 2570 scope.go:117] "RemoveContainer" containerID="001242d0ae53dfb4246d3def8d28b36b1ada8f266df74019a2fd5e262089ea81" Apr 24 21:41:32.497957 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:32.497927 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf"] Apr 24 21:41:32.501438 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:32.501416 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-dtkmf"] Apr 24 21:41:33.238873 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:33.238833 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.24:8643/healthz\": dial tcp 10.133.0.24:8643: connect: connection refused" Apr 24 21:41:34.067191 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:34.067158 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b391266-a5e9-4e4f-a458-285d471e2a5e" path="/var/lib/kubelet/pods/7b391266-a5e9-4e4f-a458-285d471e2a5e/volumes" Apr 24 21:41:35.480268 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:35.480239 2570 generic.go:358] "Generic (PLEG): container finished" podID="07a7d93a-5786-46f3-9a15-c62c97817299" containerID="946c2825aa24977ee278c0b54cf7aec87f78e1a8d6e0314b64f285c51dadf327" exitCode=0 Apr 24 21:41:35.480575 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:35.480320 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" event={"ID":"07a7d93a-5786-46f3-9a15-c62c97817299","Type":"ContainerDied","Data":"946c2825aa24977ee278c0b54cf7aec87f78e1a8d6e0314b64f285c51dadf327"} Apr 24 21:41:36.484208 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:36.484169 2570 generic.go:358] "Generic (PLEG): container finished" podID="26aab879-6046-488f-b60f-e79c92f6b863" containerID="15028261c2e1cdceb0a4d78a09f1e657fb9e6487e77dfbf22b340f5879c0872d" exitCode=0 Apr 24 21:41:36.484570 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:36.484241 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" event={"ID":"26aab879-6046-488f-b60f-e79c92f6b863","Type":"ContainerDied","Data":"15028261c2e1cdceb0a4d78a09f1e657fb9e6487e77dfbf22b340f5879c0872d"} Apr 24 21:41:38.239330 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:38.239290 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.24:8643/healthz\": dial tcp 10.133.0.24:8643: connect: connection refused" Apr 24 21:41:38.242676 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:38.242638 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:41:38.243078 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:38.243047 2570 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:41:43.239407 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:43.239366 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.24:8643/healthz\": dial tcp 10.133.0.24:8643: connect: connection refused" Apr 24 21:41:43.239792 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:43.239493 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:41:43.511544 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:43.511448 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" event={"ID":"26aab879-6046-488f-b60f-e79c92f6b863","Type":"ContainerStarted","Data":"c49a9c09e5f43678fc3b3aa46cffc285ae9bdb9e04cfced09cde8026b8a88479"} Apr 24 21:41:43.511544 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:43.511489 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" event={"ID":"26aab879-6046-488f-b60f-e79c92f6b863","Type":"ContainerStarted","Data":"7348d74f622a82582e16e5829d82440afb534d50a37e5d76ceee643eb2b55ef8"} Apr 24 21:41:43.511769 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:43.511745 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" Apr 24 21:41:43.532252 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:43.532206 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" podStartSLOduration=6.377633789 podStartE2EDuration="12.532193063s" podCreationTimestamp="2026-04-24 21:41:31 +0000 UTC" firstStartedPulling="2026-04-24 21:41:36.485488774 +0000 UTC m=+848.908996159" lastFinishedPulling="2026-04-24 21:41:42.640048048 +0000 UTC m=+855.063555433" observedRunningTime="2026-04-24 21:41:43.53070269 +0000 UTC m=+855.954210097" watchObservedRunningTime="2026-04-24 21:41:43.532193063 +0000 UTC m=+855.955700470" Apr 24 21:41:44.514899 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:44.514870 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" Apr 24 21:41:44.516074 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:44.516049 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" podUID="26aab879-6046-488f-b60f-e79c92f6b863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 24 21:41:45.518242 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:45.518201 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" podUID="26aab879-6046-488f-b60f-e79c92f6b863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 24 21:41:48.238945 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:48.238906 2570 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.24:8643/healthz\": dial tcp 10.133.0.24:8643: connect: connection refused" Apr 24 21:41:48.243292 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:48.243258 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:41:48.243633 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:48.243607 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:41:50.527204 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:50.527178 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" Apr 24 21:41:50.527821 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:50.527791 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" podUID="26aab879-6046-488f-b60f-e79c92f6b863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 24 21:41:53.239289 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:53.239247 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.24:8643/healthz\": dial tcp 10.133.0.24:8643: connect: connection refused" Apr 24 21:41:58.238762 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:58.238727 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.24:8643/healthz\": dial tcp 10.133.0.24:8643: connect: connection refused" Apr 24 21:41:58.243105 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:58.243081 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 24 21:41:58.243224 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:58.243209 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:41:58.243377 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:58.243351 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:41:58.243483 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:41:58.243455 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 
21:42:00.528087 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:00.528050 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" podUID="26aab879-6046-488f-b60f-e79c92f6b863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 24 21:42:01.567470 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:01.567438 2570 generic.go:358] "Generic (PLEG): container finished" podID="07a7d93a-5786-46f3-9a15-c62c97817299" containerID="2b4d1c7769f5fec7812bcc6b149500f9eea72340068985b255cc57a4c1b77fb9" exitCode=0 Apr 24 21:42:01.567830 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:01.567510 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" event={"ID":"07a7d93a-5786-46f3-9a15-c62c97817299","Type":"ContainerDied","Data":"2b4d1c7769f5fec7812bcc6b149500f9eea72340068985b255cc57a4c1b77fb9"} Apr 24 21:42:01.733537 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:01.733512 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:42:01.847124 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:01.847062 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7cvw\" (UniqueName: \"kubernetes.io/projected/07a7d93a-5786-46f3-9a15-c62c97817299-kube-api-access-p7cvw\") pod \"07a7d93a-5786-46f3-9a15-c62c97817299\" (UID: \"07a7d93a-5786-46f3-9a15-c62c97817299\") " Apr 24 21:42:01.847124 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:01.847100 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07a7d93a-5786-46f3-9a15-c62c97817299-kserve-provision-location\") pod \"07a7d93a-5786-46f3-9a15-c62c97817299\" (UID: \"07a7d93a-5786-46f3-9a15-c62c97817299\") " Apr 24 21:42:01.847327 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:01.847133 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/07a7d93a-5786-46f3-9a15-c62c97817299-isvc-logger-kube-rbac-proxy-sar-config\") pod \"07a7d93a-5786-46f3-9a15-c62c97817299\" (UID: \"07a7d93a-5786-46f3-9a15-c62c97817299\") " Apr 24 21:42:01.847327 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:01.847166 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07a7d93a-5786-46f3-9a15-c62c97817299-proxy-tls\") pod \"07a7d93a-5786-46f3-9a15-c62c97817299\" (UID: \"07a7d93a-5786-46f3-9a15-c62c97817299\") " Apr 24 21:42:01.847432 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:01.847384 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07a7d93a-5786-46f3-9a15-c62c97817299-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "07a7d93a-5786-46f3-9a15-c62c97817299" (UID: "07a7d93a-5786-46f3-9a15-c62c97817299"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:42:01.847507 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:01.847486 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07a7d93a-5786-46f3-9a15-c62c97817299-isvc-logger-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-kube-rbac-proxy-sar-config") pod "07a7d93a-5786-46f3-9a15-c62c97817299" (UID: "07a7d93a-5786-46f3-9a15-c62c97817299"). InnerVolumeSpecName "isvc-logger-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:42:01.849243 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:01.849224 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07a7d93a-5786-46f3-9a15-c62c97817299-kube-api-access-p7cvw" (OuterVolumeSpecName: "kube-api-access-p7cvw") pod "07a7d93a-5786-46f3-9a15-c62c97817299" (UID: "07a7d93a-5786-46f3-9a15-c62c97817299"). InnerVolumeSpecName "kube-api-access-p7cvw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:42:01.849307 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:01.849299 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07a7d93a-5786-46f3-9a15-c62c97817299-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "07a7d93a-5786-46f3-9a15-c62c97817299" (UID: "07a7d93a-5786-46f3-9a15-c62c97817299"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:42:01.948516 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:01.948486 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/07a7d93a-5786-46f3-9a15-c62c97817299-isvc-logger-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:42:01.948516 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:01.948509 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07a7d93a-5786-46f3-9a15-c62c97817299-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:42:01.948516 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:01.948518 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p7cvw\" (UniqueName: \"kubernetes.io/projected/07a7d93a-5786-46f3-9a15-c62c97817299-kube-api-access-p7cvw\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:42:01.948708 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:01.948528 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07a7d93a-5786-46f3-9a15-c62c97817299-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:42:02.572133 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:02.572098 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" event={"ID":"07a7d93a-5786-46f3-9a15-c62c97817299","Type":"ContainerDied","Data":"4e9cf6777d955b0eda2df739550ccc712f71bd4b3a590d382b62e3a1f971343a"} Apr 24 21:42:02.572133 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:02.572127 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt" Apr 24 21:42:02.572638 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:02.572148 2570 scope.go:117] "RemoveContainer" containerID="2b4d1c7769f5fec7812bcc6b149500f9eea72340068985b255cc57a4c1b77fb9" Apr 24 21:42:02.579856 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:02.579839 2570 scope.go:117] "RemoveContainer" containerID="0a4e12ca1ad51807ed498c021a746ffcdf11d9042a48c0186e978d7c8c7e0a5c" Apr 24 21:42:02.587061 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:02.587039 2570 scope.go:117] "RemoveContainer" containerID="946c2825aa24977ee278c0b54cf7aec87f78e1a8d6e0314b64f285c51dadf327" Apr 24 21:42:02.590930 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:02.590909 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt"] Apr 24 21:42:02.598145 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:02.595619 2570 scope.go:117] "RemoveContainer" containerID="a405d786a1d96ae70ca7ca2a33e9147327e9933cec2e6f6af71f64957e9c4e2f" Apr 24 21:42:02.598145 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:02.595877 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-b9bbfbbdf-w7gmt"] Apr 24 21:42:04.067149 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:04.067117 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" path="/var/lib/kubelet/pods/07a7d93a-5786-46f3-9a15-c62c97817299/volumes" Apr 24 21:42:10.528153 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:10.528113 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" podUID="26aab879-6046-488f-b60f-e79c92f6b863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 24 21:42:20.528683 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:20.528644 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" podUID="26aab879-6046-488f-b60f-e79c92f6b863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 24 21:42:28.025429 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:28.025399 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 21:42:28.025941 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:28.025601 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 21:42:30.528568 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:30.528529 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" podUID="26aab879-6046-488f-b60f-e79c92f6b863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 24 21:42:40.528714 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:40.528675 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" podUID="26aab879-6046-488f-b60f-e79c92f6b863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 24 21:42:50.528108 
ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:50.528067 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" podUID="26aab879-6046-488f-b60f-e79c92f6b863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 24 21:42:59.064197 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:42:59.064162 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" Apr 24 21:43:01.271086 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.271053 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc"] Apr 24 21:43:01.271482 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.271403 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" podUID="26aab879-6046-488f-b60f-e79c92f6b863" containerName="kserve-container" containerID="cri-o://7348d74f622a82582e16e5829d82440afb534d50a37e5d76ceee643eb2b55ef8" gracePeriod=30 Apr 24 21:43:01.271482 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.271458 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" podUID="26aab879-6046-488f-b60f-e79c92f6b863" containerName="kube-rbac-proxy" containerID="cri-o://c49a9c09e5f43678fc3b3aa46cffc285ae9bdb9e04cfced09cde8026b8a88479" gracePeriod=30 Apr 24 21:43:01.391315 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.391285 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd"] Apr 24 21:43:01.391559 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.391547 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kserve-container" Apr 24 21:43:01.391600 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.391561 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kserve-container" Apr 24 21:43:01.391600 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.391570 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b391266-a5e9-4e4f-a458-285d471e2a5e" containerName="kserve-container" Apr 24 21:43:01.391600 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.391576 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b391266-a5e9-4e4f-a458-285d471e2a5e" containerName="kserve-container" Apr 24 21:43:01.391600 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.391585 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kube-rbac-proxy" Apr 24 21:43:01.391600 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.391591 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kube-rbac-proxy" Apr 24 21:43:01.391600 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.391596 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="agent" Apr 24 21:43:01.391600 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.391601 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" 
containerName="agent" Apr 24 21:43:01.391828 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.391610 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b391266-a5e9-4e4f-a458-285d471e2a5e" containerName="kube-rbac-proxy" Apr 24 21:43:01.391828 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.391615 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b391266-a5e9-4e4f-a458-285d471e2a5e" containerName="kube-rbac-proxy" Apr 24 21:43:01.391828 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.391621 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="storage-initializer" Apr 24 21:43:01.391828 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.391626 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="storage-initializer" Apr 24 21:43:01.391828 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.391671 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="agent" Apr 24 21:43:01.391828 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.391678 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b391266-a5e9-4e4f-a458-285d471e2a5e" containerName="kube-rbac-proxy" Apr 24 21:43:01.391828 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.391683 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kube-rbac-proxy" Apr 24 21:43:01.391828 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.391690 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="07a7d93a-5786-46f3-9a15-c62c97817299" containerName="kserve-container" Apr 24 21:43:01.391828 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.391697 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b391266-a5e9-4e4f-a458-285d471e2a5e" containerName="kserve-container" Apr 24 21:43:01.394826 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.394808 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" Apr 24 21:43:01.397163 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.397143 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\"" Apr 24 21:43:01.397707 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.397692 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-predictor-serving-cert\"" Apr 24 21:43:01.405748 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.405729 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd"] Apr 24 21:43:01.465864 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.465831 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd\" (UID: \"0d7bcbab-2484-4105-9eef-f0bea45b9e0d\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" Apr 24 21:43:01.466015 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.465879 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd\" (UID: \"0d7bcbab-2484-4105-9eef-f0bea45b9e0d\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" Apr 24 21:43:01.466015 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.465942 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd\" (UID: \"0d7bcbab-2484-4105-9eef-f0bea45b9e0d\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" Apr 24 21:43:01.466015 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.465978 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkqgg\" (UniqueName: \"kubernetes.io/projected/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-kube-api-access-jkqgg\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd\" (UID: \"0d7bcbab-2484-4105-9eef-f0bea45b9e0d\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" Apr 24 21:43:01.567180 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.567080 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd\" (UID: \"0d7bcbab-2484-4105-9eef-f0bea45b9e0d\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" Apr 24 21:43:01.567180 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.567153 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-kserve-provision-location\") pod 
\"isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd\" (UID: \"0d7bcbab-2484-4105-9eef-f0bea45b9e0d\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" Apr 24 21:43:01.567392 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.567199 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd\" (UID: \"0d7bcbab-2484-4105-9eef-f0bea45b9e0d\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" Apr 24 21:43:01.567392 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.567242 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkqgg\" (UniqueName: \"kubernetes.io/projected/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-kube-api-access-jkqgg\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd\" (UID: \"0d7bcbab-2484-4105-9eef-f0bea45b9e0d\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" Apr 24 21:43:01.567682 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.567649 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd\" (UID: \"0d7bcbab-2484-4105-9eef-f0bea45b9e0d\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" Apr 24 21:43:01.568015 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.567963 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd\" (UID: \"0d7bcbab-2484-4105-9eef-f0bea45b9e0d\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" Apr 24 21:43:01.569734 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.569715 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd\" (UID: \"0d7bcbab-2484-4105-9eef-f0bea45b9e0d\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" Apr 24 21:43:01.575879 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.575855 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkqgg\" (UniqueName: \"kubernetes.io/projected/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-kube-api-access-jkqgg\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd\" (UID: \"0d7bcbab-2484-4105-9eef-f0bea45b9e0d\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" Apr 24 21:43:01.705985 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.705951 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" Apr 24 21:43:01.739228 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.739192 2570 generic.go:358] "Generic (PLEG): container finished" podID="26aab879-6046-488f-b60f-e79c92f6b863" containerID="c49a9c09e5f43678fc3b3aa46cffc285ae9bdb9e04cfced09cde8026b8a88479" exitCode=2 Apr 24 21:43:01.739364 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.739261 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" event={"ID":"26aab879-6046-488f-b60f-e79c92f6b863","Type":"ContainerDied","Data":"c49a9c09e5f43678fc3b3aa46cffc285ae9bdb9e04cfced09cde8026b8a88479"} Apr 24 21:43:01.829799 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:01.829775 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd"] Apr 24 21:43:01.832072 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:43:01.832042 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d7bcbab_2484_4105_9eef_f0bea45b9e0d.slice/crio-6f364bbc4fabc2bf62856450c4c21c6e8febee3f2068336e476e8d823f4c187d WatchSource:0}: Error finding container 6f364bbc4fabc2bf62856450c4c21c6e8febee3f2068336e476e8d823f4c187d: Status 404 returned error can't find the container with id 6f364bbc4fabc2bf62856450c4c21c6e8febee3f2068336e476e8d823f4c187d Apr 24 21:43:02.742798 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:02.742764 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" event={"ID":"0d7bcbab-2484-4105-9eef-f0bea45b9e0d","Type":"ContainerStarted","Data":"54523c612c4d24cc142e300420d8d052a016b1cef2c8e931d9a1d860da0f3566"} Apr 24 21:43:02.742798 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:02.742802 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" event={"ID":"0d7bcbab-2484-4105-9eef-f0bea45b9e0d","Type":"ContainerStarted","Data":"6f364bbc4fabc2bf62856450c4c21c6e8febee3f2068336e476e8d823f4c187d"} Apr 24 21:43:05.519353 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:05.519308 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" podUID="26aab879-6046-488f-b60f-e79c92f6b863" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.25:8643/healthz\": dial tcp 10.133.0.25:8643: connect: connection refused" Apr 24 21:43:05.751837 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:05.751803 2570 generic.go:358] "Generic (PLEG): container finished" podID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerID="54523c612c4d24cc142e300420d8d052a016b1cef2c8e931d9a1d860da0f3566" exitCode=0 Apr 24 21:43:05.751983 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:05.751861 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" event={"ID":"0d7bcbab-2484-4105-9eef-f0bea45b9e0d","Type":"ContainerDied","Data":"54523c612c4d24cc142e300420d8d052a016b1cef2c8e931d9a1d860da0f3566"} Apr 24 21:43:06.110907 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.110884 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" Apr 24 21:43:06.195085 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.195050 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvn5s\" (UniqueName: \"kubernetes.io/projected/26aab879-6046-488f-b60f-e79c92f6b863-kube-api-access-gvn5s\") pod \"26aab879-6046-488f-b60f-e79c92f6b863\" (UID: \"26aab879-6046-488f-b60f-e79c92f6b863\") " Apr 24 21:43:06.195242 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.195097 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26aab879-6046-488f-b60f-e79c92f6b863-proxy-tls\") pod \"26aab879-6046-488f-b60f-e79c92f6b863\" (UID: \"26aab879-6046-488f-b60f-e79c92f6b863\") " Apr 24 21:43:06.195242 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.195129 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26aab879-6046-488f-b60f-e79c92f6b863-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"26aab879-6046-488f-b60f-e79c92f6b863\" (UID: \"26aab879-6046-488f-b60f-e79c92f6b863\") " Apr 24 21:43:06.195242 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.195184 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26aab879-6046-488f-b60f-e79c92f6b863-kserve-provision-location\") pod \"26aab879-6046-488f-b60f-e79c92f6b863\" (UID: \"26aab879-6046-488f-b60f-e79c92f6b863\") " Apr 24 21:43:06.195534 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.195510 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26aab879-6046-488f-b60f-e79c92f6b863-isvc-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-kube-rbac-proxy-sar-config") pod "26aab879-6046-488f-b60f-e79c92f6b863" (UID: "26aab879-6046-488f-b60f-e79c92f6b863"). InnerVolumeSpecName "isvc-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:43:06.195606 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.195534 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26aab879-6046-488f-b60f-e79c92f6b863-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "26aab879-6046-488f-b60f-e79c92f6b863" (UID: "26aab879-6046-488f-b60f-e79c92f6b863"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:43:06.197344 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.197324 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26aab879-6046-488f-b60f-e79c92f6b863-kube-api-access-gvn5s" (OuterVolumeSpecName: "kube-api-access-gvn5s") pod "26aab879-6046-488f-b60f-e79c92f6b863" (UID: "26aab879-6046-488f-b60f-e79c92f6b863"). InnerVolumeSpecName "kube-api-access-gvn5s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:43:06.197424 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.197374 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26aab879-6046-488f-b60f-e79c92f6b863-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "26aab879-6046-488f-b60f-e79c92f6b863" (UID: "26aab879-6046-488f-b60f-e79c92f6b863"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:43:06.295889 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.295822 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gvn5s\" (UniqueName: \"kubernetes.io/projected/26aab879-6046-488f-b60f-e79c92f6b863-kube-api-access-gvn5s\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:43:06.295889 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.295851 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26aab879-6046-488f-b60f-e79c92f6b863-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:43:06.295889 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.295862 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26aab879-6046-488f-b60f-e79c92f6b863-isvc-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:43:06.295889 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.295871 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26aab879-6046-488f-b60f-e79c92f6b863-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:43:06.755881 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.755774 2570 generic.go:358] "Generic (PLEG): container finished" podID="26aab879-6046-488f-b60f-e79c92f6b863" containerID="7348d74f622a82582e16e5829d82440afb534d50a37e5d76ceee643eb2b55ef8" exitCode=0 Apr 24 21:43:06.755881 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.755870 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" Apr 24 21:43:06.772339 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.755871 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" event={"ID":"26aab879-6046-488f-b60f-e79c92f6b863","Type":"ContainerDied","Data":"7348d74f622a82582e16e5829d82440afb534d50a37e5d76ceee643eb2b55ef8"} Apr 24 21:43:06.772339 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.755915 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc" event={"ID":"26aab879-6046-488f-b60f-e79c92f6b863","Type":"ContainerDied","Data":"170cf14d494179ba6eb16aeb797d75c3ca736fc6cd3b8228df0103100f8a3d7e"} Apr 24 21:43:06.772339 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.755934 2570 scope.go:117] "RemoveContainer" containerID="c49a9c09e5f43678fc3b3aa46cffc285ae9bdb9e04cfced09cde8026b8a88479" Apr 24 21:43:06.772339 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.757871 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" event={"ID":"0d7bcbab-2484-4105-9eef-f0bea45b9e0d","Type":"ContainerStarted","Data":"b31a3b0a4f34e17ea66ecee164cd2e1c2cf7750c102eff82bc8d9d865f60a019"} Apr 24 21:43:06.772339 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.757893 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" event={"ID":"0d7bcbab-2484-4105-9eef-f0bea45b9e0d","Type":"ContainerStarted","Data":"c02a69f3529b1aaf73852c149a4fd1c2901d26847911320dd5115e138ea8ae26"} Apr 24 21:43:06.772339 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.758187 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" Apr 24 21:43:06.772339 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.764232 2570 scope.go:117] "RemoveContainer" containerID="7348d74f622a82582e16e5829d82440afb534d50a37e5d76ceee643eb2b55ef8" Apr 24 21:43:06.772339 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.771438 2570 scope.go:117] "RemoveContainer" containerID="15028261c2e1cdceb0a4d78a09f1e657fb9e6487e77dfbf22b340f5879c0872d" Apr 24 21:43:06.779875 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.778247 2570 scope.go:117] "RemoveContainer" containerID="c49a9c09e5f43678fc3b3aa46cffc285ae9bdb9e04cfced09cde8026b8a88479" Apr 24 21:43:06.780339 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:43:06.780320 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c49a9c09e5f43678fc3b3aa46cffc285ae9bdb9e04cfced09cde8026b8a88479\": container with ID starting with c49a9c09e5f43678fc3b3aa46cffc285ae9bdb9e04cfced09cde8026b8a88479 not found: ID does not exist" containerID="c49a9c09e5f43678fc3b3aa46cffc285ae9bdb9e04cfced09cde8026b8a88479" Apr 24 21:43:06.780423 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.780347 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c49a9c09e5f43678fc3b3aa46cffc285ae9bdb9e04cfced09cde8026b8a88479"} err="failed to get container status \"c49a9c09e5f43678fc3b3aa46cffc285ae9bdb9e04cfced09cde8026b8a88479\": rpc error: code = NotFound desc = could not find container \"c49a9c09e5f43678fc3b3aa46cffc285ae9bdb9e04cfced09cde8026b8a88479\": container with ID starting with 
c49a9c09e5f43678fc3b3aa46cffc285ae9bdb9e04cfced09cde8026b8a88479 not found: ID does not exist" Apr 24 21:43:06.780423 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.780364 2570 scope.go:117] "RemoveContainer" containerID="7348d74f622a82582e16e5829d82440afb534d50a37e5d76ceee643eb2b55ef8" Apr 24 21:43:06.780609 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:43:06.780588 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7348d74f622a82582e16e5829d82440afb534d50a37e5d76ceee643eb2b55ef8\": container with ID starting with 7348d74f622a82582e16e5829d82440afb534d50a37e5d76ceee643eb2b55ef8 not found: ID does not exist" containerID="7348d74f622a82582e16e5829d82440afb534d50a37e5d76ceee643eb2b55ef8" Apr 24 21:43:06.780649 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.780618 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7348d74f622a82582e16e5829d82440afb534d50a37e5d76ceee643eb2b55ef8"} err="failed to get container status \"7348d74f622a82582e16e5829d82440afb534d50a37e5d76ceee643eb2b55ef8\": rpc error: code = NotFound desc = could not find container \"7348d74f622a82582e16e5829d82440afb534d50a37e5d76ceee643eb2b55ef8\": container with ID starting with 7348d74f622a82582e16e5829d82440afb534d50a37e5d76ceee643eb2b55ef8 not found: ID does not exist" Apr 24 21:43:06.780649 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.780639 2570 scope.go:117] "RemoveContainer" containerID="15028261c2e1cdceb0a4d78a09f1e657fb9e6487e77dfbf22b340f5879c0872d" Apr 24 21:43:06.780875 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:43:06.780858 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15028261c2e1cdceb0a4d78a09f1e657fb9e6487e77dfbf22b340f5879c0872d\": container with ID starting with 15028261c2e1cdceb0a4d78a09f1e657fb9e6487e77dfbf22b340f5879c0872d not found: ID does not exist" containerID="15028261c2e1cdceb0a4d78a09f1e657fb9e6487e77dfbf22b340f5879c0872d" Apr 24 21:43:06.780928 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.780880 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15028261c2e1cdceb0a4d78a09f1e657fb9e6487e77dfbf22b340f5879c0872d"} err="failed to get container status \"15028261c2e1cdceb0a4d78a09f1e657fb9e6487e77dfbf22b340f5879c0872d\": rpc error: code = NotFound desc = could not find container \"15028261c2e1cdceb0a4d78a09f1e657fb9e6487e77dfbf22b340f5879c0872d\": container with ID starting with 15028261c2e1cdceb0a4d78a09f1e657fb9e6487e77dfbf22b340f5879c0872d not found: ID does not exist" Apr 24 21:43:06.782887 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.782851 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" podStartSLOduration=5.7828376519999996 podStartE2EDuration="5.782837652s" podCreationTimestamp="2026-04-24 21:43:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:43:06.78282482 +0000 UTC m=+939.206332242" watchObservedRunningTime="2026-04-24 21:43:06.782837652 +0000 UTC m=+939.206345049" Apr 24 21:43:06.797366 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:06.797341 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc"] Apr 24 21:43:06.802058 ip-10-0-129-230 
kubenswrapper[2570]: I0424 21:43:06.802039 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-zdrwc"] Apr 24 21:43:07.761657 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:07.761623 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" Apr 24 21:43:07.762662 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:07.762634 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" podUID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:43:08.067158 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:08.067086 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26aab879-6046-488f-b60f-e79c92f6b863" path="/var/lib/kubelet/pods/26aab879-6046-488f-b60f-e79c92f6b863/volumes" Apr 24 21:43:08.763771 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:08.763727 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" podUID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:43:13.767711 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:13.767679 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" Apr 24 21:43:13.768332 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:13.768295 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" podUID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:43:23.768511 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:23.768475 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" podUID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:43:33.768391 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:33.768353 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" podUID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:43:43.769084 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:43.768985 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" podUID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:43:53.768814 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:43:53.768771 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" podUID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 
21:44:03.769179 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:03.769134 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" podUID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:44:13.768805 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:13.768771 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" podUID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:44:22.066431 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:22.066400 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" Apr 24 21:44:31.703816 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:31.703769 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd"] Apr 24 21:44:31.704294 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:31.704236 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" podUID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerName="kserve-container" containerID="cri-o://c02a69f3529b1aaf73852c149a4fd1c2901d26847911320dd5115e138ea8ae26" gracePeriod=30 Apr 24 21:44:31.704371 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:31.704292 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" podUID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerName="kube-rbac-proxy" containerID="cri-o://b31a3b0a4f34e17ea66ecee164cd2e1c2cf7750c102eff82bc8d9d865f60a019" gracePeriod=30 Apr 24 21:44:31.820285 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:31.820254 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh"] Apr 24 21:44:31.820518 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:31.820507 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26aab879-6046-488f-b60f-e79c92f6b863" containerName="kube-rbac-proxy" Apr 24 21:44:31.820563 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:31.820520 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="26aab879-6046-488f-b60f-e79c92f6b863" containerName="kube-rbac-proxy" Apr 24 21:44:31.820563 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:31.820533 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26aab879-6046-488f-b60f-e79c92f6b863" containerName="kserve-container" Apr 24 21:44:31.820563 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:31.820540 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="26aab879-6046-488f-b60f-e79c92f6b863" containerName="kserve-container" Apr 24 21:44:31.820563 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:31.820554 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26aab879-6046-488f-b60f-e79c92f6b863" containerName="storage-initializer" Apr 24 21:44:31.820563 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:31.820559 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="26aab879-6046-488f-b60f-e79c92f6b863" 
containerName="storage-initializer" Apr 24 21:44:31.820725 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:31.820598 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="26aab879-6046-488f-b60f-e79c92f6b863" containerName="kserve-container" Apr 24 21:44:31.820725 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:31.820605 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="26aab879-6046-488f-b60f-e79c92f6b863" containerName="kube-rbac-proxy" Apr 24 21:44:31.823421 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:31.823403 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" Apr 24 21:44:31.825624 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:31.825603 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-predictor-serving-cert\"" Apr 24 21:44:31.825731 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:31.825606 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 24 21:44:31.835259 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:31.835232 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh"] Apr 24 21:44:31.977575 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:31.977521 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qkpb\" (UniqueName: \"kubernetes.io/projected/a85bb445-39fd-4e46-a5ea-554ad0cd3599-kube-api-access-6qkpb\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh\" (UID: \"a85bb445-39fd-4e46-a5ea-554ad0cd3599\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" Apr 24 21:44:31.977575 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:31.977571 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a85bb445-39fd-4e46-a5ea-554ad0cd3599-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh\" (UID: \"a85bb445-39fd-4e46-a5ea-554ad0cd3599\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" Apr 24 21:44:31.977700 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:31.977631 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a85bb445-39fd-4e46-a5ea-554ad0cd3599-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh\" (UID: \"a85bb445-39fd-4e46-a5ea-554ad0cd3599\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" Apr 24 21:44:31.977700 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:31.977657 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a85bb445-39fd-4e46-a5ea-554ad0cd3599-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh\" (UID: \"a85bb445-39fd-4e46-a5ea-554ad0cd3599\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" Apr 24 21:44:31.990292 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:31.990269 2570 generic.go:358] "Generic (PLEG): container 
finished" podID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerID="b31a3b0a4f34e17ea66ecee164cd2e1c2cf7750c102eff82bc8d9d865f60a019" exitCode=2 Apr 24 21:44:31.990375 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:31.990301 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" event={"ID":"0d7bcbab-2484-4105-9eef-f0bea45b9e0d","Type":"ContainerDied","Data":"b31a3b0a4f34e17ea66ecee164cd2e1c2cf7750c102eff82bc8d9d865f60a019"} Apr 24 21:44:32.063472 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:32.063443 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" podUID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 24 21:44:32.078496 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:32.078473 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a85bb445-39fd-4e46-a5ea-554ad0cd3599-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh\" (UID: \"a85bb445-39fd-4e46-a5ea-554ad0cd3599\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" Apr 24 21:44:32.078574 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:32.078523 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a85bb445-39fd-4e46-a5ea-554ad0cd3599-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh\" (UID: \"a85bb445-39fd-4e46-a5ea-554ad0cd3599\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" Apr 24 21:44:32.078574 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:32.078562 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a85bb445-39fd-4e46-a5ea-554ad0cd3599-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh\" (UID: \"a85bb445-39fd-4e46-a5ea-554ad0cd3599\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" Apr 24 21:44:32.078679 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:44:32.078667 2570 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-serving-cert: secret "isvc-lightgbm-v2-runtime-predictor-serving-cert" not found Apr 24 21:44:32.078727 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:44:32.078718 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a85bb445-39fd-4e46-a5ea-554ad0cd3599-proxy-tls podName:a85bb445-39fd-4e46-a5ea-554ad0cd3599 nodeName:}" failed. No retries permitted until 2026-04-24 21:44:32.578704776 +0000 UTC m=+1025.002212160 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a85bb445-39fd-4e46-a5ea-554ad0cd3599-proxy-tls") pod "isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" (UID: "a85bb445-39fd-4e46-a5ea-554ad0cd3599") : secret "isvc-lightgbm-v2-runtime-predictor-serving-cert" not found Apr 24 21:44:32.078789 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:32.078771 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6qkpb\" (UniqueName: \"kubernetes.io/projected/a85bb445-39fd-4e46-a5ea-554ad0cd3599-kube-api-access-6qkpb\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh\" (UID: \"a85bb445-39fd-4e46-a5ea-554ad0cd3599\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" Apr 24 21:44:32.078905 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:32.078887 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a85bb445-39fd-4e46-a5ea-554ad0cd3599-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh\" (UID: \"a85bb445-39fd-4e46-a5ea-554ad0cd3599\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" Apr 24 21:44:32.079263 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:32.079246 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a85bb445-39fd-4e46-a5ea-554ad0cd3599-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh\" (UID: \"a85bb445-39fd-4e46-a5ea-554ad0cd3599\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" Apr 24 21:44:32.095321 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:32.095298 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qkpb\" (UniqueName: \"kubernetes.io/projected/a85bb445-39fd-4e46-a5ea-554ad0cd3599-kube-api-access-6qkpb\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh\" (UID: \"a85bb445-39fd-4e46-a5ea-554ad0cd3599\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" Apr 24 21:44:32.583002 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:32.582972 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a85bb445-39fd-4e46-a5ea-554ad0cd3599-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh\" (UID: \"a85bb445-39fd-4e46-a5ea-554ad0cd3599\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" Apr 24 21:44:32.585379 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:32.585359 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a85bb445-39fd-4e46-a5ea-554ad0cd3599-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh\" (UID: \"a85bb445-39fd-4e46-a5ea-554ad0cd3599\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" Apr 24 21:44:32.732906 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:32.732878 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" Apr 24 21:44:32.858594 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:32.858568 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh"] Apr 24 21:44:32.860964 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:44:32.860935 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda85bb445_39fd_4e46_a5ea_554ad0cd3599.slice/crio-72cf805f3fcd76e838aea1d06fc049bc233f3455086278009183980fca11744d WatchSource:0}: Error finding container 72cf805f3fcd76e838aea1d06fc049bc233f3455086278009183980fca11744d: Status 404 returned error can't find the container with id 72cf805f3fcd76e838aea1d06fc049bc233f3455086278009183980fca11744d Apr 24 21:44:32.993737 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:32.993696 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" event={"ID":"a85bb445-39fd-4e46-a5ea-554ad0cd3599","Type":"ContainerStarted","Data":"31c8425494c0def3ecb6cb31c05831e24ce7339f85eb1ece1b224794d44c2518"} Apr 24 21:44:32.993894 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:32.993740 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" event={"ID":"a85bb445-39fd-4e46-a5ea-554ad0cd3599","Type":"ContainerStarted","Data":"72cf805f3fcd76e838aea1d06fc049bc233f3455086278009183980fca11744d"} Apr 24 21:44:33.764596 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:33.764556 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" podUID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.26:8643/healthz\": dial tcp 10.133.0.26:8643: connect: connection refused" Apr 24 21:44:36.488550 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:36.488498 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" Apr 24 21:44:36.612395 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:36.612371 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-proxy-tls\") pod \"0d7bcbab-2484-4105-9eef-f0bea45b9e0d\" (UID: \"0d7bcbab-2484-4105-9eef-f0bea45b9e0d\") " Apr 24 21:44:36.612530 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:36.612438 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-kserve-provision-location\") pod \"0d7bcbab-2484-4105-9eef-f0bea45b9e0d\" (UID: \"0d7bcbab-2484-4105-9eef-f0bea45b9e0d\") " Apr 24 21:44:36.612530 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:36.612463 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"0d7bcbab-2484-4105-9eef-f0bea45b9e0d\" (UID: \"0d7bcbab-2484-4105-9eef-f0bea45b9e0d\") " Apr 24 21:44:36.612530 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:36.612486 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkqgg\" (UniqueName: \"kubernetes.io/projected/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-kube-api-access-jkqgg\") pod \"0d7bcbab-2484-4105-9eef-f0bea45b9e0d\" (UID: \"0d7bcbab-2484-4105-9eef-f0bea45b9e0d\") " Apr 24 21:44:36.612834 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:36.612806 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config") pod "0d7bcbab-2484-4105-9eef-f0bea45b9e0d" (UID: "0d7bcbab-2484-4105-9eef-f0bea45b9e0d"). InnerVolumeSpecName "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:44:36.612950 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:36.612806 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0d7bcbab-2484-4105-9eef-f0bea45b9e0d" (UID: "0d7bcbab-2484-4105-9eef-f0bea45b9e0d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:44:36.614586 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:36.614559 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0d7bcbab-2484-4105-9eef-f0bea45b9e0d" (UID: "0d7bcbab-2484-4105-9eef-f0bea45b9e0d"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:44:36.614666 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:36.614601 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-kube-api-access-jkqgg" (OuterVolumeSpecName: "kube-api-access-jkqgg") pod "0d7bcbab-2484-4105-9eef-f0bea45b9e0d" (UID: "0d7bcbab-2484-4105-9eef-f0bea45b9e0d"). InnerVolumeSpecName "kube-api-access-jkqgg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:44:36.713361 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:36.713327 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:44:36.713361 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:36.713360 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:44:36.713492 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:36.713375 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:44:36.713492 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:36.713391 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jkqgg\" (UniqueName: \"kubernetes.io/projected/0d7bcbab-2484-4105-9eef-f0bea45b9e0d-kube-api-access-jkqgg\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:44:37.005292 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:37.005260 2570 generic.go:358] "Generic (PLEG): container finished" podID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerID="c02a69f3529b1aaf73852c149a4fd1c2901d26847911320dd5115e138ea8ae26" exitCode=0 Apr 24 21:44:37.005433 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:37.005337 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" Apr 24 21:44:37.005433 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:37.005335 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" event={"ID":"0d7bcbab-2484-4105-9eef-f0bea45b9e0d","Type":"ContainerDied","Data":"c02a69f3529b1aaf73852c149a4fd1c2901d26847911320dd5115e138ea8ae26"} Apr 24 21:44:37.005554 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:37.005449 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd" event={"ID":"0d7bcbab-2484-4105-9eef-f0bea45b9e0d","Type":"ContainerDied","Data":"6f364bbc4fabc2bf62856450c4c21c6e8febee3f2068336e476e8d823f4c187d"} Apr 24 21:44:37.005554 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:37.005479 2570 scope.go:117] "RemoveContainer" containerID="b31a3b0a4f34e17ea66ecee164cd2e1c2cf7750c102eff82bc8d9d865f60a019" Apr 24 21:44:37.006845 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:37.006821 2570 generic.go:358] "Generic (PLEG): container finished" podID="a85bb445-39fd-4e46-a5ea-554ad0cd3599" containerID="31c8425494c0def3ecb6cb31c05831e24ce7339f85eb1ece1b224794d44c2518" exitCode=0 Apr 24 21:44:37.006955 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:37.006894 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" event={"ID":"a85bb445-39fd-4e46-a5ea-554ad0cd3599","Type":"ContainerDied","Data":"31c8425494c0def3ecb6cb31c05831e24ce7339f85eb1ece1b224794d44c2518"} Apr 24 21:44:37.014003 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:37.013985 2570 scope.go:117] "RemoveContainer" containerID="c02a69f3529b1aaf73852c149a4fd1c2901d26847911320dd5115e138ea8ae26" Apr 24 21:44:37.020747 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:37.020731 2570 scope.go:117] "RemoveContainer" containerID="54523c612c4d24cc142e300420d8d052a016b1cef2c8e931d9a1d860da0f3566" Apr 24 21:44:37.027202 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:37.027179 2570 scope.go:117] "RemoveContainer" containerID="b31a3b0a4f34e17ea66ecee164cd2e1c2cf7750c102eff82bc8d9d865f60a019" Apr 24 21:44:37.027450 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:44:37.027433 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b31a3b0a4f34e17ea66ecee164cd2e1c2cf7750c102eff82bc8d9d865f60a019\": container with ID starting with b31a3b0a4f34e17ea66ecee164cd2e1c2cf7750c102eff82bc8d9d865f60a019 not found: ID does not exist" containerID="b31a3b0a4f34e17ea66ecee164cd2e1c2cf7750c102eff82bc8d9d865f60a019" Apr 24 21:44:37.027499 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:37.027458 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b31a3b0a4f34e17ea66ecee164cd2e1c2cf7750c102eff82bc8d9d865f60a019"} err="failed to get container status \"b31a3b0a4f34e17ea66ecee164cd2e1c2cf7750c102eff82bc8d9d865f60a019\": rpc error: code = NotFound desc = could not find container \"b31a3b0a4f34e17ea66ecee164cd2e1c2cf7750c102eff82bc8d9d865f60a019\": container with ID starting with b31a3b0a4f34e17ea66ecee164cd2e1c2cf7750c102eff82bc8d9d865f60a019 not found: ID does not exist" Apr 24 21:44:37.027499 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:37.027475 2570 scope.go:117] "RemoveContainer" containerID="c02a69f3529b1aaf73852c149a4fd1c2901d26847911320dd5115e138ea8ae26" Apr 24 
21:44:37.027686 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:44:37.027670 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c02a69f3529b1aaf73852c149a4fd1c2901d26847911320dd5115e138ea8ae26\": container with ID starting with c02a69f3529b1aaf73852c149a4fd1c2901d26847911320dd5115e138ea8ae26 not found: ID does not exist" containerID="c02a69f3529b1aaf73852c149a4fd1c2901d26847911320dd5115e138ea8ae26" Apr 24 21:44:37.027727 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:37.027691 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c02a69f3529b1aaf73852c149a4fd1c2901d26847911320dd5115e138ea8ae26"} err="failed to get container status \"c02a69f3529b1aaf73852c149a4fd1c2901d26847911320dd5115e138ea8ae26\": rpc error: code = NotFound desc = could not find container \"c02a69f3529b1aaf73852c149a4fd1c2901d26847911320dd5115e138ea8ae26\": container with ID starting with c02a69f3529b1aaf73852c149a4fd1c2901d26847911320dd5115e138ea8ae26 not found: ID does not exist" Apr 24 21:44:37.027727 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:37.027706 2570 scope.go:117] "RemoveContainer" containerID="54523c612c4d24cc142e300420d8d052a016b1cef2c8e931d9a1d860da0f3566" Apr 24 21:44:37.027909 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:44:37.027893 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54523c612c4d24cc142e300420d8d052a016b1cef2c8e931d9a1d860da0f3566\": container with ID starting with 54523c612c4d24cc142e300420d8d052a016b1cef2c8e931d9a1d860da0f3566 not found: ID does not exist" containerID="54523c612c4d24cc142e300420d8d052a016b1cef2c8e931d9a1d860da0f3566" Apr 24 21:44:37.027959 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:37.027917 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54523c612c4d24cc142e300420d8d052a016b1cef2c8e931d9a1d860da0f3566"} err="failed to get container status \"54523c612c4d24cc142e300420d8d052a016b1cef2c8e931d9a1d860da0f3566\": rpc error: code = NotFound desc = could not find container \"54523c612c4d24cc142e300420d8d052a016b1cef2c8e931d9a1d860da0f3566\": container with ID starting with 54523c612c4d24cc142e300420d8d052a016b1cef2c8e931d9a1d860da0f3566 not found: ID does not exist" Apr 24 21:44:37.040535 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:37.040513 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd"] Apr 24 21:44:37.044293 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:37.044275 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-qdttd"] Apr 24 21:44:38.068665 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:44:38.068630 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" path="/var/lib/kubelet/pods/0d7bcbab-2484-4105-9eef-f0bea45b9e0d/volumes" Apr 24 21:46:36.912365 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:46:36.912345 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:46:37.374184 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:46:37.374146 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" 
event={"ID":"a85bb445-39fd-4e46-a5ea-554ad0cd3599","Type":"ContainerStarted","Data":"8f1e4e2c2233dfd02d4ba6cebc0dde2080db7fc3d59131e49697a089674981ca"} Apr 24 21:46:37.374184 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:46:37.374181 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" event={"ID":"a85bb445-39fd-4e46-a5ea-554ad0cd3599","Type":"ContainerStarted","Data":"698bcfc8d2b0397cc94b84639cc53eed7d648195a7c64c196ed5b9c8dd486d14"} Apr 24 21:46:37.374443 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:46:37.374272 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" Apr 24 21:46:37.408319 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:46:37.408262 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" podStartSLOduration=6.626134556 podStartE2EDuration="2m6.408244283s" podCreationTimestamp="2026-04-24 21:44:31 +0000 UTC" firstStartedPulling="2026-04-24 21:44:37.007861935 +0000 UTC m=+1029.431369320" lastFinishedPulling="2026-04-24 21:46:36.78997166 +0000 UTC m=+1149.213479047" observedRunningTime="2026-04-24 21:46:37.405814413 +0000 UTC m=+1149.829321819" watchObservedRunningTime="2026-04-24 21:46:37.408244283 +0000 UTC m=+1149.831751689" Apr 24 21:46:38.377556 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:46:38.377528 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" Apr 24 21:46:44.387113 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:46:44.387019 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" Apr 24 21:47:14.390462 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:14.390433 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" Apr 24 21:47:21.986355 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:21.986319 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh"] Apr 24 21:47:21.986907 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:21.986733 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" podUID="a85bb445-39fd-4e46-a5ea-554ad0cd3599" containerName="kserve-container" containerID="cri-o://698bcfc8d2b0397cc94b84639cc53eed7d648195a7c64c196ed5b9c8dd486d14" gracePeriod=30 Apr 24 21:47:21.986907 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:21.986782 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" podUID="a85bb445-39fd-4e46-a5ea-554ad0cd3599" containerName="kube-rbac-proxy" containerID="cri-o://8f1e4e2c2233dfd02d4ba6cebc0dde2080db7fc3d59131e49697a089674981ca" gracePeriod=30 Apr 24 21:47:22.091789 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.091749 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q"] Apr 24 21:47:22.092111 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.092094 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerName="kube-rbac-proxy" Apr 24 21:47:22.092203 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.092113 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerName="kube-rbac-proxy" Apr 24 21:47:22.092203 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.092128 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerName="storage-initializer" Apr 24 21:47:22.092203 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.092136 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerName="storage-initializer" Apr 24 21:47:22.092203 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.092158 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerName="kserve-container" Apr 24 21:47:22.092203 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.092167 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerName="kserve-container" Apr 24 21:47:22.092437 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.092245 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerName="kserve-container" Apr 24 21:47:22.092437 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.092259 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d7bcbab-2484-4105-9eef-f0bea45b9e0d" containerName="kube-rbac-proxy" Apr 24 21:47:22.095530 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.095503 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" Apr 24 21:47:22.097863 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.097838 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-predictor-serving-cert\"" Apr 24 21:47:22.097985 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.097864 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 24 21:47:22.105583 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.105557 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q"] Apr 24 21:47:22.202457 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.202416 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q\" (UID: \"67a584ea-05ac-42e0-b9a9-0fc4310de4fa\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" Apr 24 21:47:22.202457 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.202456 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q\" (UID: \"67a584ea-05ac-42e0-b9a9-0fc4310de4fa\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" Apr 24 21:47:22.202668 
ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.202498 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvsn9\" (UniqueName: \"kubernetes.io/projected/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-kube-api-access-nvsn9\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q\" (UID: \"67a584ea-05ac-42e0-b9a9-0fc4310de4fa\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" Apr 24 21:47:22.202668 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.202525 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q\" (UID: \"67a584ea-05ac-42e0-b9a9-0fc4310de4fa\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" Apr 24 21:47:22.303301 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.303218 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvsn9\" (UniqueName: \"kubernetes.io/projected/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-kube-api-access-nvsn9\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q\" (UID: \"67a584ea-05ac-42e0-b9a9-0fc4310de4fa\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" Apr 24 21:47:22.303301 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.303268 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q\" (UID: \"67a584ea-05ac-42e0-b9a9-0fc4310de4fa\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" Apr 24 21:47:22.303480 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.303326 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q\" (UID: \"67a584ea-05ac-42e0-b9a9-0fc4310de4fa\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" Apr 24 21:47:22.303480 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.303351 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q\" (UID: \"67a584ea-05ac-42e0-b9a9-0fc4310de4fa\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" Apr 24 21:47:22.303750 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.303732 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q\" (UID: \"67a584ea-05ac-42e0-b9a9-0fc4310de4fa\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" Apr 24 21:47:22.303968 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.303947 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q\" (UID: \"67a584ea-05ac-42e0-b9a9-0fc4310de4fa\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" Apr 24 21:47:22.306010 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.305985 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q\" (UID: \"67a584ea-05ac-42e0-b9a9-0fc4310de4fa\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" Apr 24 21:47:22.313138 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.313119 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvsn9\" (UniqueName: \"kubernetes.io/projected/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-kube-api-access-nvsn9\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q\" (UID: \"67a584ea-05ac-42e0-b9a9-0fc4310de4fa\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" Apr 24 21:47:22.406450 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.406419 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" Apr 24 21:47:22.497925 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.497861 2570 generic.go:358] "Generic (PLEG): container finished" podID="a85bb445-39fd-4e46-a5ea-554ad0cd3599" containerID="8f1e4e2c2233dfd02d4ba6cebc0dde2080db7fc3d59131e49697a089674981ca" exitCode=2 Apr 24 21:47:22.497925 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.497907 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" event={"ID":"a85bb445-39fd-4e46-a5ea-554ad0cd3599","Type":"ContainerDied","Data":"8f1e4e2c2233dfd02d4ba6cebc0dde2080db7fc3d59131e49697a089674981ca"} Apr 24 21:47:22.526724 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:22.526694 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q"] Apr 24 21:47:22.529674 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:47:22.529644 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67a584ea_05ac_42e0_b9a9_0fc4310de4fa.slice/crio-cfbc5a2da7f6310ead1dbaa7c7a1861fed813132a5a7877b97092e70c6d97d44 WatchSource:0}: Error finding container cfbc5a2da7f6310ead1dbaa7c7a1861fed813132a5a7877b97092e70c6d97d44: Status 404 returned error can't find the container with id cfbc5a2da7f6310ead1dbaa7c7a1861fed813132a5a7877b97092e70c6d97d44 Apr 24 21:47:23.033098 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.033078 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" Apr 24 21:47:23.110133 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.110006 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a85bb445-39fd-4e46-a5ea-554ad0cd3599-proxy-tls\") pod \"a85bb445-39fd-4e46-a5ea-554ad0cd3599\" (UID: \"a85bb445-39fd-4e46-a5ea-554ad0cd3599\") " Apr 24 21:47:23.110133 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.110109 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a85bb445-39fd-4e46-a5ea-554ad0cd3599-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"a85bb445-39fd-4e46-a5ea-554ad0cd3599\" (UID: \"a85bb445-39fd-4e46-a5ea-554ad0cd3599\") " Apr 24 21:47:23.110352 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.110156 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a85bb445-39fd-4e46-a5ea-554ad0cd3599-kserve-provision-location\") pod \"a85bb445-39fd-4e46-a5ea-554ad0cd3599\" (UID: \"a85bb445-39fd-4e46-a5ea-554ad0cd3599\") " Apr 24 21:47:23.110352 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.110188 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qkpb\" (UniqueName: \"kubernetes.io/projected/a85bb445-39fd-4e46-a5ea-554ad0cd3599-kube-api-access-6qkpb\") pod \"a85bb445-39fd-4e46-a5ea-554ad0cd3599\" (UID: \"a85bb445-39fd-4e46-a5ea-554ad0cd3599\") " Apr 24 21:47:23.110483 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.110456 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a85bb445-39fd-4e46-a5ea-554ad0cd3599-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config") pod "a85bb445-39fd-4e46-a5ea-554ad0cd3599" (UID: "a85bb445-39fd-4e46-a5ea-554ad0cd3599"). InnerVolumeSpecName "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:47:23.110483 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.110467 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a85bb445-39fd-4e46-a5ea-554ad0cd3599-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a85bb445-39fd-4e46-a5ea-554ad0cd3599" (UID: "a85bb445-39fd-4e46-a5ea-554ad0cd3599"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:47:23.112245 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.112223 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a85bb445-39fd-4e46-a5ea-554ad0cd3599-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a85bb445-39fd-4e46-a5ea-554ad0cd3599" (UID: "a85bb445-39fd-4e46-a5ea-554ad0cd3599"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:47:23.112343 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.112319 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a85bb445-39fd-4e46-a5ea-554ad0cd3599-kube-api-access-6qkpb" (OuterVolumeSpecName: "kube-api-access-6qkpb") pod "a85bb445-39fd-4e46-a5ea-554ad0cd3599" (UID: "a85bb445-39fd-4e46-a5ea-554ad0cd3599"). InnerVolumeSpecName "kube-api-access-6qkpb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:47:23.210925 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.210886 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6qkpb\" (UniqueName: \"kubernetes.io/projected/a85bb445-39fd-4e46-a5ea-554ad0cd3599-kube-api-access-6qkpb\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:47:23.210925 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.210921 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a85bb445-39fd-4e46-a5ea-554ad0cd3599-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:47:23.210925 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.210931 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a85bb445-39fd-4e46-a5ea-554ad0cd3599-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:47:23.211134 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.210941 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a85bb445-39fd-4e46-a5ea-554ad0cd3599-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:47:23.502188 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.502157 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" event={"ID":"67a584ea-05ac-42e0-b9a9-0fc4310de4fa","Type":"ContainerStarted","Data":"a00768e11e49b3fa13de225ab535f6b8ad5d46f870cd11de81495941f17ba91e"} Apr 24 21:47:23.502188 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.502190 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" event={"ID":"67a584ea-05ac-42e0-b9a9-0fc4310de4fa","Type":"ContainerStarted","Data":"cfbc5a2da7f6310ead1dbaa7c7a1861fed813132a5a7877b97092e70c6d97d44"} Apr 24 21:47:23.503669 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.503645 2570 generic.go:358] "Generic (PLEG): container finished" podID="a85bb445-39fd-4e46-a5ea-554ad0cd3599" containerID="698bcfc8d2b0397cc94b84639cc53eed7d648195a7c64c196ed5b9c8dd486d14" exitCode=0 Apr 24 21:47:23.503793 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.503715 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" event={"ID":"a85bb445-39fd-4e46-a5ea-554ad0cd3599","Type":"ContainerDied","Data":"698bcfc8d2b0397cc94b84639cc53eed7d648195a7c64c196ed5b9c8dd486d14"} Apr 24 21:47:23.503793 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.503721 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" Apr 24 21:47:23.503793 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.503735 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh" event={"ID":"a85bb445-39fd-4e46-a5ea-554ad0cd3599","Type":"ContainerDied","Data":"72cf805f3fcd76e838aea1d06fc049bc233f3455086278009183980fca11744d"} Apr 24 21:47:23.503793 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.503749 2570 scope.go:117] "RemoveContainer" containerID="8f1e4e2c2233dfd02d4ba6cebc0dde2080db7fc3d59131e49697a089674981ca" Apr 24 21:47:23.511698 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.511680 2570 scope.go:117] "RemoveContainer" containerID="698bcfc8d2b0397cc94b84639cc53eed7d648195a7c64c196ed5b9c8dd486d14" Apr 24 21:47:23.518518 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.518504 2570 scope.go:117] "RemoveContainer" containerID="31c8425494c0def3ecb6cb31c05831e24ce7339f85eb1ece1b224794d44c2518" Apr 24 21:47:23.525462 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.525448 2570 scope.go:117] "RemoveContainer" containerID="8f1e4e2c2233dfd02d4ba6cebc0dde2080db7fc3d59131e49697a089674981ca" Apr 24 21:47:23.525688 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:47:23.525670 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f1e4e2c2233dfd02d4ba6cebc0dde2080db7fc3d59131e49697a089674981ca\": container with ID starting with 8f1e4e2c2233dfd02d4ba6cebc0dde2080db7fc3d59131e49697a089674981ca not found: ID does not exist" containerID="8f1e4e2c2233dfd02d4ba6cebc0dde2080db7fc3d59131e49697a089674981ca" Apr 24 21:47:23.525760 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.525702 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f1e4e2c2233dfd02d4ba6cebc0dde2080db7fc3d59131e49697a089674981ca"} err="failed to get container status \"8f1e4e2c2233dfd02d4ba6cebc0dde2080db7fc3d59131e49697a089674981ca\": rpc error: code = NotFound desc = could not find container \"8f1e4e2c2233dfd02d4ba6cebc0dde2080db7fc3d59131e49697a089674981ca\": container with ID starting with 8f1e4e2c2233dfd02d4ba6cebc0dde2080db7fc3d59131e49697a089674981ca not found: ID does not exist" Apr 24 21:47:23.525760 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.525727 2570 scope.go:117] "RemoveContainer" containerID="698bcfc8d2b0397cc94b84639cc53eed7d648195a7c64c196ed5b9c8dd486d14" Apr 24 21:47:23.525940 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:47:23.525923 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"698bcfc8d2b0397cc94b84639cc53eed7d648195a7c64c196ed5b9c8dd486d14\": container with ID starting with 698bcfc8d2b0397cc94b84639cc53eed7d648195a7c64c196ed5b9c8dd486d14 not found: ID does not exist" containerID="698bcfc8d2b0397cc94b84639cc53eed7d648195a7c64c196ed5b9c8dd486d14" Apr 24 21:47:23.525977 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.525947 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"698bcfc8d2b0397cc94b84639cc53eed7d648195a7c64c196ed5b9c8dd486d14"} err="failed to get container status \"698bcfc8d2b0397cc94b84639cc53eed7d648195a7c64c196ed5b9c8dd486d14\": rpc error: code = NotFound desc = could not find container \"698bcfc8d2b0397cc94b84639cc53eed7d648195a7c64c196ed5b9c8dd486d14\": container with ID starting with 
698bcfc8d2b0397cc94b84639cc53eed7d648195a7c64c196ed5b9c8dd486d14 not found: ID does not exist" Apr 24 21:47:23.526043 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.525963 2570 scope.go:117] "RemoveContainer" containerID="31c8425494c0def3ecb6cb31c05831e24ce7339f85eb1ece1b224794d44c2518" Apr 24 21:47:23.526267 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:47:23.526252 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31c8425494c0def3ecb6cb31c05831e24ce7339f85eb1ece1b224794d44c2518\": container with ID starting with 31c8425494c0def3ecb6cb31c05831e24ce7339f85eb1ece1b224794d44c2518 not found: ID does not exist" containerID="31c8425494c0def3ecb6cb31c05831e24ce7339f85eb1ece1b224794d44c2518" Apr 24 21:47:23.526308 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.526270 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31c8425494c0def3ecb6cb31c05831e24ce7339f85eb1ece1b224794d44c2518"} err="failed to get container status \"31c8425494c0def3ecb6cb31c05831e24ce7339f85eb1ece1b224794d44c2518\": rpc error: code = NotFound desc = could not find container \"31c8425494c0def3ecb6cb31c05831e24ce7339f85eb1ece1b224794d44c2518\": container with ID starting with 31c8425494c0def3ecb6cb31c05831e24ce7339f85eb1ece1b224794d44c2518 not found: ID does not exist" Apr 24 21:47:23.533511 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.533488 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh"] Apr 24 21:47:23.536956 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:23.536936 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-6jjzh"] Apr 24 21:47:24.066763 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:24.066732 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a85bb445-39fd-4e46-a5ea-554ad0cd3599" path="/var/lib/kubelet/pods/a85bb445-39fd-4e46-a5ea-554ad0cd3599/volumes" Apr 24 21:47:27.522406 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:27.522375 2570 generic.go:358] "Generic (PLEG): container finished" podID="67a584ea-05ac-42e0-b9a9-0fc4310de4fa" containerID="a00768e11e49b3fa13de225ab535f6b8ad5d46f870cd11de81495941f17ba91e" exitCode=0 Apr 24 21:47:27.522828 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:27.522445 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" event={"ID":"67a584ea-05ac-42e0-b9a9-0fc4310de4fa","Type":"ContainerDied","Data":"a00768e11e49b3fa13de225ab535f6b8ad5d46f870cd11de81495941f17ba91e"} Apr 24 21:47:28.045918 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:28.045893 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 21:47:28.047836 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:28.047815 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 21:47:28.527037 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:28.526994 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" 
event={"ID":"67a584ea-05ac-42e0-b9a9-0fc4310de4fa","Type":"ContainerStarted","Data":"ebde8a2b2020c63d0a6b5548d61697ed9c86f46133c9c192e65dfaf33fb20b88"} Apr 24 21:47:28.527412 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:28.527057 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" event={"ID":"67a584ea-05ac-42e0-b9a9-0fc4310de4fa","Type":"ContainerStarted","Data":"3aefff44e560728be6ddb8ca85393de3af1a79be016d5517d104666c90b38a12"} Apr 24 21:47:28.527412 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:28.527333 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" Apr 24 21:47:28.527493 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:28.527470 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" Apr 24 21:47:28.528848 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:28.528822 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" podUID="67a584ea-05ac-42e0-b9a9-0fc4310de4fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 21:47:28.546352 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:28.546303 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" podStartSLOduration=6.546288986 podStartE2EDuration="6.546288986s" podCreationTimestamp="2026-04-24 21:47:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:47:28.544586824 +0000 UTC m=+1200.968094235" watchObservedRunningTime="2026-04-24 21:47:28.546288986 +0000 UTC m=+1200.969796393" Apr 24 21:47:29.530321 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:29.530286 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" podUID="67a584ea-05ac-42e0-b9a9-0fc4310de4fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 24 21:47:34.534696 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:34.534668 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" Apr 24 21:47:34.536002 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:34.535979 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" Apr 24 21:47:42.113885 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.113854 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q"] Apr 24 21:47:42.114339 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.114304 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" podUID="67a584ea-05ac-42e0-b9a9-0fc4310de4fa" containerName="kserve-container" containerID="cri-o://3aefff44e560728be6ddb8ca85393de3af1a79be016d5517d104666c90b38a12" gracePeriod=30 Apr 24 21:47:42.114402 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.114334 2570 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" podUID="67a584ea-05ac-42e0-b9a9-0fc4310de4fa" containerName="kube-rbac-proxy" containerID="cri-o://ebde8a2b2020c63d0a6b5548d61697ed9c86f46133c9c192e65dfaf33fb20b88" gracePeriod=30 Apr 24 21:47:42.178776 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.178739 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7"] Apr 24 21:47:42.179082 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.179069 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a85bb445-39fd-4e46-a5ea-554ad0cd3599" containerName="kube-rbac-proxy" Apr 24 21:47:42.179134 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.179086 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85bb445-39fd-4e46-a5ea-554ad0cd3599" containerName="kube-rbac-proxy" Apr 24 21:47:42.179134 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.179096 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a85bb445-39fd-4e46-a5ea-554ad0cd3599" containerName="kserve-container" Apr 24 21:47:42.179134 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.179102 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85bb445-39fd-4e46-a5ea-554ad0cd3599" containerName="kserve-container" Apr 24 21:47:42.179134 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.179110 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a85bb445-39fd-4e46-a5ea-554ad0cd3599" containerName="storage-initializer" Apr 24 21:47:42.179134 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.179116 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85bb445-39fd-4e46-a5ea-554ad0cd3599" containerName="storage-initializer" Apr 24 21:47:42.179329 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.179189 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="a85bb445-39fd-4e46-a5ea-554ad0cd3599" containerName="kserve-container" Apr 24 21:47:42.179329 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.179202 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="a85bb445-39fd-4e46-a5ea-554ad0cd3599" containerName="kube-rbac-proxy" Apr 24 21:47:42.182185 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.182168 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" Apr 24 21:47:42.184350 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.184311 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 24 21:47:42.184440 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.184386 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-predictor-serving-cert\"" Apr 24 21:47:42.191177 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.191156 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7"] Apr 24 21:47:42.246204 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.246160 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn5r6\" (UniqueName: \"kubernetes.io/projected/71779215-5d0e-403a-ac2e-1b31342ae166-kube-api-access-vn5r6\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7\" (UID: \"71779215-5d0e-403a-ac2e-1b31342ae166\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" Apr 24 21:47:42.246372 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.246219 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71779215-5d0e-403a-ac2e-1b31342ae166-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7\" (UID: \"71779215-5d0e-403a-ac2e-1b31342ae166\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" Apr 24 21:47:42.246372 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.246294 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/71779215-5d0e-403a-ac2e-1b31342ae166-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7\" (UID: \"71779215-5d0e-403a-ac2e-1b31342ae166\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" Apr 24 21:47:42.246372 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.246343 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71779215-5d0e-403a-ac2e-1b31342ae166-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7\" (UID: \"71779215-5d0e-403a-ac2e-1b31342ae166\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" Apr 24 21:47:42.347327 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.347288 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71779215-5d0e-403a-ac2e-1b31342ae166-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7\" (UID: \"71779215-5d0e-403a-ac2e-1b31342ae166\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" Apr 24 21:47:42.347511 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.347337 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/71779215-5d0e-403a-ac2e-1b31342ae166-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7\" (UID: \"71779215-5d0e-403a-ac2e-1b31342ae166\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" Apr 24 21:47:42.347511 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.347371 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71779215-5d0e-403a-ac2e-1b31342ae166-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7\" (UID: \"71779215-5d0e-403a-ac2e-1b31342ae166\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" Apr 24 21:47:42.347511 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.347425 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vn5r6\" (UniqueName: \"kubernetes.io/projected/71779215-5d0e-403a-ac2e-1b31342ae166-kube-api-access-vn5r6\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7\" (UID: \"71779215-5d0e-403a-ac2e-1b31342ae166\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" Apr 24 21:47:42.347859 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.347838 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71779215-5d0e-403a-ac2e-1b31342ae166-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7\" (UID: \"71779215-5d0e-403a-ac2e-1b31342ae166\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" Apr 24 21:47:42.348080 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.348062 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/71779215-5d0e-403a-ac2e-1b31342ae166-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7\" (UID: \"71779215-5d0e-403a-ac2e-1b31342ae166\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" Apr 24 21:47:42.349813 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.349794 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71779215-5d0e-403a-ac2e-1b31342ae166-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7\" (UID: \"71779215-5d0e-403a-ac2e-1b31342ae166\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" Apr 24 21:47:42.356376 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.356359 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn5r6\" (UniqueName: \"kubernetes.io/projected/71779215-5d0e-403a-ac2e-1b31342ae166-kube-api-access-vn5r6\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7\" (UID: \"71779215-5d0e-403a-ac2e-1b31342ae166\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" Apr 24 21:47:42.492950 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.492909 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" Apr 24 21:47:42.568742 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.568702 2570 generic.go:358] "Generic (PLEG): container finished" podID="67a584ea-05ac-42e0-b9a9-0fc4310de4fa" containerID="ebde8a2b2020c63d0a6b5548d61697ed9c86f46133c9c192e65dfaf33fb20b88" exitCode=2 Apr 24 21:47:42.568900 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.568767 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" event={"ID":"67a584ea-05ac-42e0-b9a9-0fc4310de4fa","Type":"ContainerDied","Data":"ebde8a2b2020c63d0a6b5548d61697ed9c86f46133c9c192e65dfaf33fb20b88"} Apr 24 21:47:42.618915 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.618880 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7"] Apr 24 21:47:42.623337 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:47:42.623309 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71779215_5d0e_403a_ac2e_1b31342ae166.slice/crio-b883ab82be832a78e174fd2254a577046a7b702ff78b295c0726097666567930 WatchSource:0}: Error finding container b883ab82be832a78e174fd2254a577046a7b702ff78b295c0726097666567930: Status 404 returned error can't find the container with id b883ab82be832a78e174fd2254a577046a7b702ff78b295c0726097666567930 Apr 24 21:47:42.840633 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.840611 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" Apr 24 21:47:42.951585 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.951551 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-kserve-provision-location\") pod \"67a584ea-05ac-42e0-b9a9-0fc4310de4fa\" (UID: \"67a584ea-05ac-42e0-b9a9-0fc4310de4fa\") " Apr 24 21:47:42.951747 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.951593 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvsn9\" (UniqueName: \"kubernetes.io/projected/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-kube-api-access-nvsn9\") pod \"67a584ea-05ac-42e0-b9a9-0fc4310de4fa\" (UID: \"67a584ea-05ac-42e0-b9a9-0fc4310de4fa\") " Apr 24 21:47:42.951747 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.951617 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-proxy-tls\") pod \"67a584ea-05ac-42e0-b9a9-0fc4310de4fa\" (UID: \"67a584ea-05ac-42e0-b9a9-0fc4310de4fa\") " Apr 24 21:47:42.951747 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.951646 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"67a584ea-05ac-42e0-b9a9-0fc4310de4fa\" (UID: \"67a584ea-05ac-42e0-b9a9-0fc4310de4fa\") " Apr 24 21:47:42.952001 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.951964 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "67a584ea-05ac-42e0-b9a9-0fc4310de4fa" (UID: "67a584ea-05ac-42e0-b9a9-0fc4310de4fa"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:47:42.952001 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.951997 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config") pod "67a584ea-05ac-42e0-b9a9-0fc4310de4fa" (UID: "67a584ea-05ac-42e0-b9a9-0fc4310de4fa"). InnerVolumeSpecName "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:47:42.953782 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.953762 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-kube-api-access-nvsn9" (OuterVolumeSpecName: "kube-api-access-nvsn9") pod "67a584ea-05ac-42e0-b9a9-0fc4310de4fa" (UID: "67a584ea-05ac-42e0-b9a9-0fc4310de4fa"). InnerVolumeSpecName "kube-api-access-nvsn9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:47:42.953859 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:42.953781 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "67a584ea-05ac-42e0-b9a9-0fc4310de4fa" (UID: "67a584ea-05ac-42e0-b9a9-0fc4310de4fa"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:47:43.052896 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:43.052818 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nvsn9\" (UniqueName: \"kubernetes.io/projected/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-kube-api-access-nvsn9\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:47:43.052896 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:43.052847 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:47:43.052896 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:43.052857 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:47:43.052896 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:43.052867 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67a584ea-05ac-42e0-b9a9-0fc4310de4fa-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:47:43.573469 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:43.573432 2570 generic.go:358] "Generic (PLEG): container finished" podID="67a584ea-05ac-42e0-b9a9-0fc4310de4fa" containerID="3aefff44e560728be6ddb8ca85393de3af1a79be016d5517d104666c90b38a12" exitCode=0 Apr 24 21:47:43.573898 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:43.573518 2570 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" event={"ID":"67a584ea-05ac-42e0-b9a9-0fc4310de4fa","Type":"ContainerDied","Data":"3aefff44e560728be6ddb8ca85393de3af1a79be016d5517d104666c90b38a12"} Apr 24 21:47:43.573898 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:43.573551 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" Apr 24 21:47:43.573898 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:43.573561 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q" event={"ID":"67a584ea-05ac-42e0-b9a9-0fc4310de4fa","Type":"ContainerDied","Data":"cfbc5a2da7f6310ead1dbaa7c7a1861fed813132a5a7877b97092e70c6d97d44"} Apr 24 21:47:43.573898 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:43.573578 2570 scope.go:117] "RemoveContainer" containerID="ebde8a2b2020c63d0a6b5548d61697ed9c86f46133c9c192e65dfaf33fb20b88" Apr 24 21:47:43.574848 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:43.574827 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" event={"ID":"71779215-5d0e-403a-ac2e-1b31342ae166","Type":"ContainerStarted","Data":"accb63a222e9adf93b78297dd503694d04360a5e4d6289ad2af5a349d807ab5e"} Apr 24 21:47:43.574915 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:43.574859 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" event={"ID":"71779215-5d0e-403a-ac2e-1b31342ae166","Type":"ContainerStarted","Data":"b883ab82be832a78e174fd2254a577046a7b702ff78b295c0726097666567930"} Apr 24 21:47:43.582409 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:43.582357 2570 scope.go:117] "RemoveContainer" containerID="3aefff44e560728be6ddb8ca85393de3af1a79be016d5517d104666c90b38a12" Apr 24 21:47:43.589147 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:43.589128 2570 scope.go:117] "RemoveContainer" containerID="a00768e11e49b3fa13de225ab535f6b8ad5d46f870cd11de81495941f17ba91e" Apr 24 21:47:43.595732 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:43.595715 2570 scope.go:117] "RemoveContainer" containerID="ebde8a2b2020c63d0a6b5548d61697ed9c86f46133c9c192e65dfaf33fb20b88" Apr 24 21:47:43.595967 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:47:43.595950 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebde8a2b2020c63d0a6b5548d61697ed9c86f46133c9c192e65dfaf33fb20b88\": container with ID starting with ebde8a2b2020c63d0a6b5548d61697ed9c86f46133c9c192e65dfaf33fb20b88 not found: ID does not exist" containerID="ebde8a2b2020c63d0a6b5548d61697ed9c86f46133c9c192e65dfaf33fb20b88" Apr 24 21:47:43.596010 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:43.595976 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebde8a2b2020c63d0a6b5548d61697ed9c86f46133c9c192e65dfaf33fb20b88"} err="failed to get container status \"ebde8a2b2020c63d0a6b5548d61697ed9c86f46133c9c192e65dfaf33fb20b88\": rpc error: code = NotFound desc = could not find container \"ebde8a2b2020c63d0a6b5548d61697ed9c86f46133c9c192e65dfaf33fb20b88\": container with ID starting with ebde8a2b2020c63d0a6b5548d61697ed9c86f46133c9c192e65dfaf33fb20b88 not found: ID does not exist" Apr 24 21:47:43.596010 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:43.595992 
2570 scope.go:117] "RemoveContainer" containerID="3aefff44e560728be6ddb8ca85393de3af1a79be016d5517d104666c90b38a12" Apr 24 21:47:43.596184 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:47:43.596170 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aefff44e560728be6ddb8ca85393de3af1a79be016d5517d104666c90b38a12\": container with ID starting with 3aefff44e560728be6ddb8ca85393de3af1a79be016d5517d104666c90b38a12 not found: ID does not exist" containerID="3aefff44e560728be6ddb8ca85393de3af1a79be016d5517d104666c90b38a12" Apr 24 21:47:43.596219 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:43.596190 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aefff44e560728be6ddb8ca85393de3af1a79be016d5517d104666c90b38a12"} err="failed to get container status \"3aefff44e560728be6ddb8ca85393de3af1a79be016d5517d104666c90b38a12\": rpc error: code = NotFound desc = could not find container \"3aefff44e560728be6ddb8ca85393de3af1a79be016d5517d104666c90b38a12\": container with ID starting with 3aefff44e560728be6ddb8ca85393de3af1a79be016d5517d104666c90b38a12 not found: ID does not exist" Apr 24 21:47:43.596219 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:43.596203 2570 scope.go:117] "RemoveContainer" containerID="a00768e11e49b3fa13de225ab535f6b8ad5d46f870cd11de81495941f17ba91e" Apr 24 21:47:43.596383 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:47:43.596366 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a00768e11e49b3fa13de225ab535f6b8ad5d46f870cd11de81495941f17ba91e\": container with ID starting with a00768e11e49b3fa13de225ab535f6b8ad5d46f870cd11de81495941f17ba91e not found: ID does not exist" containerID="a00768e11e49b3fa13de225ab535f6b8ad5d46f870cd11de81495941f17ba91e" Apr 24 21:47:43.596435 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:43.596392 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a00768e11e49b3fa13de225ab535f6b8ad5d46f870cd11de81495941f17ba91e"} err="failed to get container status \"a00768e11e49b3fa13de225ab535f6b8ad5d46f870cd11de81495941f17ba91e\": rpc error: code = NotFound desc = could not find container \"a00768e11e49b3fa13de225ab535f6b8ad5d46f870cd11de81495941f17ba91e\": container with ID starting with a00768e11e49b3fa13de225ab535f6b8ad5d46f870cd11de81495941f17ba91e not found: ID does not exist" Apr 24 21:47:43.610923 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:43.610898 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q"] Apr 24 21:47:43.615179 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:43.615159 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-l9c2q"] Apr 24 21:47:44.067669 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:44.067635 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67a584ea-05ac-42e0-b9a9-0fc4310de4fa" path="/var/lib/kubelet/pods/67a584ea-05ac-42e0-b9a9-0fc4310de4fa/volumes" Apr 24 21:47:47.587319 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:47.587287 2570 generic.go:358] "Generic (PLEG): container finished" podID="71779215-5d0e-403a-ac2e-1b31342ae166" containerID="accb63a222e9adf93b78297dd503694d04360a5e4d6289ad2af5a349d807ab5e" exitCode=0 Apr 24 21:47:47.587675 ip-10-0-129-230 kubenswrapper[2570]: I0424 
21:47:47.587350 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" event={"ID":"71779215-5d0e-403a-ac2e-1b31342ae166","Type":"ContainerDied","Data":"accb63a222e9adf93b78297dd503694d04360a5e4d6289ad2af5a349d807ab5e"} Apr 24 21:47:48.592702 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:48.592664 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" event={"ID":"71779215-5d0e-403a-ac2e-1b31342ae166","Type":"ContainerStarted","Data":"5cb3ffbd290d441a86e19673852bb1b0b2db509b029d65c6d4a5420e2b5da9b1"} Apr 24 21:47:48.592702 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:48.592703 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" event={"ID":"71779215-5d0e-403a-ac2e-1b31342ae166","Type":"ContainerStarted","Data":"809eb41d14540cb4a47ba634fd1b37c6e5270d609743c4563de7e5f45612f222"} Apr 24 21:47:48.593246 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:48.592929 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" Apr 24 21:47:48.593246 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:48.592986 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" Apr 24 21:47:48.613993 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:48.613949 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" podStartSLOduration=6.613936822 podStartE2EDuration="6.613936822s" podCreationTimestamp="2026-04-24 21:47:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:47:48.613009525 +0000 UTC m=+1221.036516938" watchObservedRunningTime="2026-04-24 21:47:48.613936822 +0000 UTC m=+1221.037444231" Apr 24 21:47:54.600963 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:47:54.600936 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" Apr 24 21:48:24.605126 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:24.605050 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" Apr 24 21:48:32.265214 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.265180 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7"] Apr 24 21:48:32.265626 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.265557 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" podUID="71779215-5d0e-403a-ac2e-1b31342ae166" containerName="kserve-container" containerID="cri-o://809eb41d14540cb4a47ba634fd1b37c6e5270d609743c4563de7e5f45612f222" gracePeriod=30 Apr 24 21:48:32.265782 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.265727 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" podUID="71779215-5d0e-403a-ac2e-1b31342ae166" containerName="kube-rbac-proxy" 
containerID="cri-o://5cb3ffbd290d441a86e19673852bb1b0b2db509b029d65c6d4a5420e2b5da9b1" gracePeriod=30 Apr 24 21:48:32.346968 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.346925 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh"] Apr 24 21:48:32.347256 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.347235 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67a584ea-05ac-42e0-b9a9-0fc4310de4fa" containerName="storage-initializer" Apr 24 21:48:32.347256 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.347251 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="67a584ea-05ac-42e0-b9a9-0fc4310de4fa" containerName="storage-initializer" Apr 24 21:48:32.347256 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.347260 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67a584ea-05ac-42e0-b9a9-0fc4310de4fa" containerName="kube-rbac-proxy" Apr 24 21:48:32.347420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.347269 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="67a584ea-05ac-42e0-b9a9-0fc4310de4fa" containerName="kube-rbac-proxy" Apr 24 21:48:32.347420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.347280 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67a584ea-05ac-42e0-b9a9-0fc4310de4fa" containerName="kserve-container" Apr 24 21:48:32.347420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.347287 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="67a584ea-05ac-42e0-b9a9-0fc4310de4fa" containerName="kserve-container" Apr 24 21:48:32.347420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.347342 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="67a584ea-05ac-42e0-b9a9-0fc4310de4fa" containerName="kube-rbac-proxy" Apr 24 21:48:32.347420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.347358 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="67a584ea-05ac-42e0-b9a9-0fc4310de4fa" containerName="kserve-container" Apr 24 21:48:32.351881 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.351863 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:48:32.354000 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.353978 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-predictor-serving-cert\"" Apr 24 21:48:32.354100 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.354001 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\"" Apr 24 21:48:32.361784 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.361761 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh"] Apr 24 21:48:32.499811 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.499781 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/578f12f9-dc65-478d-bdaa-ce53f23dfda6-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-57d594db69-zp9mh\" (UID: \"578f12f9-dc65-478d-bdaa-ce53f23dfda6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:48:32.499985 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.499836 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/578f12f9-dc65-478d-bdaa-ce53f23dfda6-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-57d594db69-zp9mh\" (UID: \"578f12f9-dc65-478d-bdaa-ce53f23dfda6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:48:32.499985 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.499865 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/578f12f9-dc65-478d-bdaa-ce53f23dfda6-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-57d594db69-zp9mh\" (UID: \"578f12f9-dc65-478d-bdaa-ce53f23dfda6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:48:32.499985 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.499884 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vwgd\" (UniqueName: \"kubernetes.io/projected/578f12f9-dc65-478d-bdaa-ce53f23dfda6-kube-api-access-5vwgd\") pod \"isvc-sklearn-mcp-predictor-57d594db69-zp9mh\" (UID: \"578f12f9-dc65-478d-bdaa-ce53f23dfda6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:48:32.600466 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.600383 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/578f12f9-dc65-478d-bdaa-ce53f23dfda6-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-57d594db69-zp9mh\" (UID: \"578f12f9-dc65-478d-bdaa-ce53f23dfda6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:48:32.600466 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.600436 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/578f12f9-dc65-478d-bdaa-ce53f23dfda6-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod 
\"isvc-sklearn-mcp-predictor-57d594db69-zp9mh\" (UID: \"578f12f9-dc65-478d-bdaa-ce53f23dfda6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:48:32.600691 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.600469 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vwgd\" (UniqueName: \"kubernetes.io/projected/578f12f9-dc65-478d-bdaa-ce53f23dfda6-kube-api-access-5vwgd\") pod \"isvc-sklearn-mcp-predictor-57d594db69-zp9mh\" (UID: \"578f12f9-dc65-478d-bdaa-ce53f23dfda6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:48:32.600691 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.600512 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/578f12f9-dc65-478d-bdaa-ce53f23dfda6-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-57d594db69-zp9mh\" (UID: \"578f12f9-dc65-478d-bdaa-ce53f23dfda6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:48:32.600691 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:48:32.600632 2570 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-serving-cert: secret "isvc-sklearn-mcp-predictor-serving-cert" not found Apr 24 21:48:32.600829 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:48:32.600698 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/578f12f9-dc65-478d-bdaa-ce53f23dfda6-proxy-tls podName:578f12f9-dc65-478d-bdaa-ce53f23dfda6 nodeName:}" failed. No retries permitted until 2026-04-24 21:48:33.100676263 +0000 UTC m=+1265.524183653 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/578f12f9-dc65-478d-bdaa-ce53f23dfda6-proxy-tls") pod "isvc-sklearn-mcp-predictor-57d594db69-zp9mh" (UID: "578f12f9-dc65-478d-bdaa-ce53f23dfda6") : secret "isvc-sklearn-mcp-predictor-serving-cert" not found Apr 24 21:48:32.600829 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.600818 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/578f12f9-dc65-478d-bdaa-ce53f23dfda6-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-57d594db69-zp9mh\" (UID: \"578f12f9-dc65-478d-bdaa-ce53f23dfda6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:48:32.601127 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.601109 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/578f12f9-dc65-478d-bdaa-ce53f23dfda6-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-57d594db69-zp9mh\" (UID: \"578f12f9-dc65-478d-bdaa-ce53f23dfda6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:48:32.608933 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.608909 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vwgd\" (UniqueName: \"kubernetes.io/projected/578f12f9-dc65-478d-bdaa-ce53f23dfda6-kube-api-access-5vwgd\") pod \"isvc-sklearn-mcp-predictor-57d594db69-zp9mh\" (UID: \"578f12f9-dc65-478d-bdaa-ce53f23dfda6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:48:32.714232 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.714199 2570 generic.go:358] "Generic (PLEG): container finished" 
podID="71779215-5d0e-403a-ac2e-1b31342ae166" containerID="5cb3ffbd290d441a86e19673852bb1b0b2db509b029d65c6d4a5420e2b5da9b1" exitCode=2 Apr 24 21:48:32.714375 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:32.714263 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" event={"ID":"71779215-5d0e-403a-ac2e-1b31342ae166","Type":"ContainerDied","Data":"5cb3ffbd290d441a86e19673852bb1b0b2db509b029d65c6d4a5420e2b5da9b1"} Apr 24 21:48:33.105756 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.105715 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/578f12f9-dc65-478d-bdaa-ce53f23dfda6-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-57d594db69-zp9mh\" (UID: \"578f12f9-dc65-478d-bdaa-ce53f23dfda6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:48:33.108216 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.108193 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/578f12f9-dc65-478d-bdaa-ce53f23dfda6-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-57d594db69-zp9mh\" (UID: \"578f12f9-dc65-478d-bdaa-ce53f23dfda6\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:48:33.262086 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.262063 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:48:33.399680 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.399652 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh"] Apr 24 21:48:33.402925 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:48:33.402897 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod578f12f9_dc65_478d_bdaa_ce53f23dfda6.slice/crio-e79050571ee3980f97d5e8cb510a906ce0628a252246cdba95969a760875d200 WatchSource:0}: Error finding container e79050571ee3980f97d5e8cb510a906ce0628a252246cdba95969a760875d200: Status 404 returned error can't find the container with id e79050571ee3980f97d5e8cb510a906ce0628a252246cdba95969a760875d200 Apr 24 21:48:33.415724 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.415704 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" Apr 24 21:48:33.508440 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.508417 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71779215-5d0e-403a-ac2e-1b31342ae166-kserve-provision-location\") pod \"71779215-5d0e-403a-ac2e-1b31342ae166\" (UID: \"71779215-5d0e-403a-ac2e-1b31342ae166\") " Apr 24 21:48:33.508566 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.508457 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71779215-5d0e-403a-ac2e-1b31342ae166-proxy-tls\") pod \"71779215-5d0e-403a-ac2e-1b31342ae166\" (UID: \"71779215-5d0e-403a-ac2e-1b31342ae166\") " Apr 24 21:48:33.508566 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.508490 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/71779215-5d0e-403a-ac2e-1b31342ae166-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"71779215-5d0e-403a-ac2e-1b31342ae166\" (UID: \"71779215-5d0e-403a-ac2e-1b31342ae166\") " Apr 24 21:48:33.508668 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.508570 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn5r6\" (UniqueName: \"kubernetes.io/projected/71779215-5d0e-403a-ac2e-1b31342ae166-kube-api-access-vn5r6\") pod \"71779215-5d0e-403a-ac2e-1b31342ae166\" (UID: \"71779215-5d0e-403a-ac2e-1b31342ae166\") " Apr 24 21:48:33.508809 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.508788 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71779215-5d0e-403a-ac2e-1b31342ae166-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config") pod "71779215-5d0e-403a-ac2e-1b31342ae166" (UID: "71779215-5d0e-403a-ac2e-1b31342ae166"). InnerVolumeSpecName "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:48:33.508886 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.508829 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71779215-5d0e-403a-ac2e-1b31342ae166-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "71779215-5d0e-403a-ac2e-1b31342ae166" (UID: "71779215-5d0e-403a-ac2e-1b31342ae166"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:48:33.510473 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.510454 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71779215-5d0e-403a-ac2e-1b31342ae166-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "71779215-5d0e-403a-ac2e-1b31342ae166" (UID: "71779215-5d0e-403a-ac2e-1b31342ae166"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:48:33.510614 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.510595 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71779215-5d0e-403a-ac2e-1b31342ae166-kube-api-access-vn5r6" (OuterVolumeSpecName: "kube-api-access-vn5r6") pod "71779215-5d0e-403a-ac2e-1b31342ae166" (UID: "71779215-5d0e-403a-ac2e-1b31342ae166"). InnerVolumeSpecName "kube-api-access-vn5r6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:48:33.610076 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.610038 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71779215-5d0e-403a-ac2e-1b31342ae166-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:48:33.610076 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.610078 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71779215-5d0e-403a-ac2e-1b31342ae166-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:48:33.610236 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.610092 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/71779215-5d0e-403a-ac2e-1b31342ae166-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:48:33.610236 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.610105 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vn5r6\" (UniqueName: \"kubernetes.io/projected/71779215-5d0e-403a-ac2e-1b31342ae166-kube-api-access-vn5r6\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:48:33.719065 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.719016 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" event={"ID":"578f12f9-dc65-478d-bdaa-ce53f23dfda6","Type":"ContainerStarted","Data":"2d490d2fec4f2a94b7022a3eecada006f0e69594eb71840e2cb11f967f63dcbd"} Apr 24 21:48:33.719065 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.719067 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" event={"ID":"578f12f9-dc65-478d-bdaa-ce53f23dfda6","Type":"ContainerStarted","Data":"e79050571ee3980f97d5e8cb510a906ce0628a252246cdba95969a760875d200"} Apr 24 21:48:33.720667 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.720641 2570 generic.go:358] "Generic (PLEG): container finished" podID="71779215-5d0e-403a-ac2e-1b31342ae166" containerID="809eb41d14540cb4a47ba634fd1b37c6e5270d609743c4563de7e5f45612f222" exitCode=0 Apr 24 21:48:33.720791 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.720699 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" event={"ID":"71779215-5d0e-403a-ac2e-1b31342ae166","Type":"ContainerDied","Data":"809eb41d14540cb4a47ba634fd1b37c6e5270d609743c4563de7e5f45612f222"} Apr 24 21:48:33.720791 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.720729 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" 
event={"ID":"71779215-5d0e-403a-ac2e-1b31342ae166","Type":"ContainerDied","Data":"b883ab82be832a78e174fd2254a577046a7b702ff78b295c0726097666567930"} Apr 24 21:48:33.720791 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.720736 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7" Apr 24 21:48:33.720791 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.720748 2570 scope.go:117] "RemoveContainer" containerID="5cb3ffbd290d441a86e19673852bb1b0b2db509b029d65c6d4a5420e2b5da9b1" Apr 24 21:48:33.728516 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.728500 2570 scope.go:117] "RemoveContainer" containerID="809eb41d14540cb4a47ba634fd1b37c6e5270d609743c4563de7e5f45612f222" Apr 24 21:48:33.735668 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.735651 2570 scope.go:117] "RemoveContainer" containerID="accb63a222e9adf93b78297dd503694d04360a5e4d6289ad2af5a349d807ab5e" Apr 24 21:48:33.742864 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.742847 2570 scope.go:117] "RemoveContainer" containerID="5cb3ffbd290d441a86e19673852bb1b0b2db509b029d65c6d4a5420e2b5da9b1" Apr 24 21:48:33.743135 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:48:33.743117 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cb3ffbd290d441a86e19673852bb1b0b2db509b029d65c6d4a5420e2b5da9b1\": container with ID starting with 5cb3ffbd290d441a86e19673852bb1b0b2db509b029d65c6d4a5420e2b5da9b1 not found: ID does not exist" containerID="5cb3ffbd290d441a86e19673852bb1b0b2db509b029d65c6d4a5420e2b5da9b1" Apr 24 21:48:33.743186 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.743141 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cb3ffbd290d441a86e19673852bb1b0b2db509b029d65c6d4a5420e2b5da9b1"} err="failed to get container status \"5cb3ffbd290d441a86e19673852bb1b0b2db509b029d65c6d4a5420e2b5da9b1\": rpc error: code = NotFound desc = could not find container \"5cb3ffbd290d441a86e19673852bb1b0b2db509b029d65c6d4a5420e2b5da9b1\": container with ID starting with 5cb3ffbd290d441a86e19673852bb1b0b2db509b029d65c6d4a5420e2b5da9b1 not found: ID does not exist" Apr 24 21:48:33.743186 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.743157 2570 scope.go:117] "RemoveContainer" containerID="809eb41d14540cb4a47ba634fd1b37c6e5270d609743c4563de7e5f45612f222" Apr 24 21:48:33.743388 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:48:33.743371 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"809eb41d14540cb4a47ba634fd1b37c6e5270d609743c4563de7e5f45612f222\": container with ID starting with 809eb41d14540cb4a47ba634fd1b37c6e5270d609743c4563de7e5f45612f222 not found: ID does not exist" containerID="809eb41d14540cb4a47ba634fd1b37c6e5270d609743c4563de7e5f45612f222" Apr 24 21:48:33.743430 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.743395 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"809eb41d14540cb4a47ba634fd1b37c6e5270d609743c4563de7e5f45612f222"} err="failed to get container status \"809eb41d14540cb4a47ba634fd1b37c6e5270d609743c4563de7e5f45612f222\": rpc error: code = NotFound desc = could not find container \"809eb41d14540cb4a47ba634fd1b37c6e5270d609743c4563de7e5f45612f222\": container with ID starting with 809eb41d14540cb4a47ba634fd1b37c6e5270d609743c4563de7e5f45612f222 not found: ID 
does not exist" Apr 24 21:48:33.743430 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.743411 2570 scope.go:117] "RemoveContainer" containerID="accb63a222e9adf93b78297dd503694d04360a5e4d6289ad2af5a349d807ab5e" Apr 24 21:48:33.743664 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:48:33.743640 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"accb63a222e9adf93b78297dd503694d04360a5e4d6289ad2af5a349d807ab5e\": container with ID starting with accb63a222e9adf93b78297dd503694d04360a5e4d6289ad2af5a349d807ab5e not found: ID does not exist" containerID="accb63a222e9adf93b78297dd503694d04360a5e4d6289ad2af5a349d807ab5e" Apr 24 21:48:33.743758 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.743670 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"accb63a222e9adf93b78297dd503694d04360a5e4d6289ad2af5a349d807ab5e"} err="failed to get container status \"accb63a222e9adf93b78297dd503694d04360a5e4d6289ad2af5a349d807ab5e\": rpc error: code = NotFound desc = could not find container \"accb63a222e9adf93b78297dd503694d04360a5e4d6289ad2af5a349d807ab5e\": container with ID starting with accb63a222e9adf93b78297dd503694d04360a5e4d6289ad2af5a349d807ab5e not found: ID does not exist" Apr 24 21:48:33.750546 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.750529 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7"] Apr 24 21:48:33.753767 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:33.753749 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-h5mf7"] Apr 24 21:48:34.067443 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:34.067354 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71779215-5d0e-403a-ac2e-1b31342ae166" path="/var/lib/kubelet/pods/71779215-5d0e-403a-ac2e-1b31342ae166/volumes" Apr 24 21:48:37.736175 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:37.736145 2570 generic.go:358] "Generic (PLEG): container finished" podID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerID="2d490d2fec4f2a94b7022a3eecada006f0e69594eb71840e2cb11f967f63dcbd" exitCode=0 Apr 24 21:48:37.736529 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:37.736230 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" event={"ID":"578f12f9-dc65-478d-bdaa-ce53f23dfda6","Type":"ContainerDied","Data":"2d490d2fec4f2a94b7022a3eecada006f0e69594eb71840e2cb11f967f63dcbd"} Apr 24 21:48:38.740949 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:38.740915 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" event={"ID":"578f12f9-dc65-478d-bdaa-ce53f23dfda6","Type":"ContainerStarted","Data":"73baf8f98470b3c31c5445855cd9d41b62f80526af0e3d2d18d9107021470922"} Apr 24 21:48:39.745538 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:39.745453 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" event={"ID":"578f12f9-dc65-478d-bdaa-ce53f23dfda6","Type":"ContainerStarted","Data":"a12c912fc7123387b7298c5e81c11a07831d980746de499e5903dd1233090317"} Apr 24 21:48:39.745538 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:39.745486 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" event={"ID":"578f12f9-dc65-478d-bdaa-ce53f23dfda6","Type":"ContainerStarted","Data":"6ae6711fcbab86209b8d9fb62f3f1cfaa099de90fb66d5586c0af46439531378"} Apr 24 21:48:39.746013 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:39.745577 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:48:39.767411 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:39.767370 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" podStartSLOduration=6.064513106 podStartE2EDuration="7.767356788s" podCreationTimestamp="2026-04-24 21:48:32 +0000 UTC" firstStartedPulling="2026-04-24 21:48:37.791879546 +0000 UTC m=+1270.215386931" lastFinishedPulling="2026-04-24 21:48:39.494723229 +0000 UTC m=+1271.918230613" observedRunningTime="2026-04-24 21:48:39.764952394 +0000 UTC m=+1272.188459800" watchObservedRunningTime="2026-04-24 21:48:39.767356788 +0000 UTC m=+1272.190864193" Apr 24 21:48:40.748282 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:40.748250 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:48:40.748282 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:40.748283 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:48:46.756193 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:48:46.756160 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:49:16.758757 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:16.758725 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:49:46.759948 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:46.759881 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:49:52.447842 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.447811 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh"] Apr 24 21:49:52.448330 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.448167 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerName="kserve-container" containerID="cri-o://73baf8f98470b3c31c5445855cd9d41b62f80526af0e3d2d18d9107021470922" gracePeriod=30 Apr 24 21:49:52.448330 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.448207 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerName="kserve-agent" containerID="cri-o://6ae6711fcbab86209b8d9fb62f3f1cfaa099de90fb66d5586c0af46439531378" gracePeriod=30 Apr 24 21:49:52.448522 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.448473 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" 
containerName="kube-rbac-proxy" containerID="cri-o://a12c912fc7123387b7298c5e81c11a07831d980746de499e5903dd1233090317" gracePeriod=30 Apr 24 21:49:52.519606 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.519582 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl"] Apr 24 21:49:52.519853 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.519841 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71779215-5d0e-403a-ac2e-1b31342ae166" containerName="storage-initializer" Apr 24 21:49:52.519900 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.519854 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="71779215-5d0e-403a-ac2e-1b31342ae166" containerName="storage-initializer" Apr 24 21:49:52.519900 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.519863 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71779215-5d0e-403a-ac2e-1b31342ae166" containerName="kube-rbac-proxy" Apr 24 21:49:52.519900 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.519869 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="71779215-5d0e-403a-ac2e-1b31342ae166" containerName="kube-rbac-proxy" Apr 24 21:49:52.519900 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.519881 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71779215-5d0e-403a-ac2e-1b31342ae166" containerName="kserve-container" Apr 24 21:49:52.519900 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.519886 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="71779215-5d0e-403a-ac2e-1b31342ae166" containerName="kserve-container" Apr 24 21:49:52.520067 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.519937 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="71779215-5d0e-403a-ac2e-1b31342ae166" containerName="kube-rbac-proxy" Apr 24 21:49:52.520067 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.519944 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="71779215-5d0e-403a-ac2e-1b31342ae166" containerName="kserve-container" Apr 24 21:49:52.522876 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.522861 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" Apr 24 21:49:52.525035 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.524995 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-predictor-serving-cert\"" Apr 24 21:49:52.525035 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.525015 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-kube-rbac-proxy-sar-config\"" Apr 24 21:49:52.538447 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.538427 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl"] Apr 24 21:49:52.646449 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.646424 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/424d4c27-d7a2-4840-b8c6-3a86bc106059-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-s5jwl\" (UID: \"424d4c27-d7a2-4840-b8c6-3a86bc106059\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" Apr 24 21:49:52.646540 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.646468 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h8tt\" (UniqueName: \"kubernetes.io/projected/424d4c27-d7a2-4840-b8c6-3a86bc106059-kube-api-access-6h8tt\") pod \"isvc-paddle-predictor-6b8b7cfb4b-s5jwl\" (UID: \"424d4c27-d7a2-4840-b8c6-3a86bc106059\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" Apr 24 21:49:52.646540 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.646493 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/424d4c27-d7a2-4840-b8c6-3a86bc106059-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-s5jwl\" (UID: \"424d4c27-d7a2-4840-b8c6-3a86bc106059\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" Apr 24 21:49:52.646540 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.646531 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/424d4c27-d7a2-4840-b8c6-3a86bc106059-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-s5jwl\" (UID: \"424d4c27-d7a2-4840-b8c6-3a86bc106059\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" Apr 24 21:49:52.746943 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.746876 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/424d4c27-d7a2-4840-b8c6-3a86bc106059-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-s5jwl\" (UID: \"424d4c27-d7a2-4840-b8c6-3a86bc106059\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" Apr 24 21:49:52.746943 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.746916 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/424d4c27-d7a2-4840-b8c6-3a86bc106059-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-s5jwl\" (UID: \"424d4c27-d7a2-4840-b8c6-3a86bc106059\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" Apr 24 
21:49:52.747202 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.746968 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/424d4c27-d7a2-4840-b8c6-3a86bc106059-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-s5jwl\" (UID: \"424d4c27-d7a2-4840-b8c6-3a86bc106059\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" Apr 24 21:49:52.747202 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:49:52.747048 2570 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-paddle-predictor-serving-cert: secret "isvc-paddle-predictor-serving-cert" not found Apr 24 21:49:52.747202 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.747055 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6h8tt\" (UniqueName: \"kubernetes.io/projected/424d4c27-d7a2-4840-b8c6-3a86bc106059-kube-api-access-6h8tt\") pod \"isvc-paddle-predictor-6b8b7cfb4b-s5jwl\" (UID: \"424d4c27-d7a2-4840-b8c6-3a86bc106059\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" Apr 24 21:49:52.747202 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:49:52.747114 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/424d4c27-d7a2-4840-b8c6-3a86bc106059-proxy-tls podName:424d4c27-d7a2-4840-b8c6-3a86bc106059 nodeName:}" failed. No retries permitted until 2026-04-24 21:49:53.247094824 +0000 UTC m=+1345.670602208 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/424d4c27-d7a2-4840-b8c6-3a86bc106059-proxy-tls") pod "isvc-paddle-predictor-6b8b7cfb4b-s5jwl" (UID: "424d4c27-d7a2-4840-b8c6-3a86bc106059") : secret "isvc-paddle-predictor-serving-cert" not found Apr 24 21:49:52.747368 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.747336 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/424d4c27-d7a2-4840-b8c6-3a86bc106059-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-s5jwl\" (UID: \"424d4c27-d7a2-4840-b8c6-3a86bc106059\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" Apr 24 21:49:52.747631 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.747614 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/424d4c27-d7a2-4840-b8c6-3a86bc106059-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-s5jwl\" (UID: \"424d4c27-d7a2-4840-b8c6-3a86bc106059\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" Apr 24 21:49:52.756320 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.756298 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h8tt\" (UniqueName: \"kubernetes.io/projected/424d4c27-d7a2-4840-b8c6-3a86bc106059-kube-api-access-6h8tt\") pod \"isvc-paddle-predictor-6b8b7cfb4b-s5jwl\" (UID: \"424d4c27-d7a2-4840-b8c6-3a86bc106059\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" Apr 24 21:49:52.944510 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.944478 2570 generic.go:358] "Generic (PLEG): container finished" podID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerID="a12c912fc7123387b7298c5e81c11a07831d980746de499e5903dd1233090317" exitCode=2 Apr 24 21:49:52.944666 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:52.944554 2570 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" event={"ID":"578f12f9-dc65-478d-bdaa-ce53f23dfda6","Type":"ContainerDied","Data":"a12c912fc7123387b7298c5e81c11a07831d980746de499e5903dd1233090317"} Apr 24 21:49:53.252044 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:53.251992 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/424d4c27-d7a2-4840-b8c6-3a86bc106059-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-s5jwl\" (UID: \"424d4c27-d7a2-4840-b8c6-3a86bc106059\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" Apr 24 21:49:53.254480 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:53.254453 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/424d4c27-d7a2-4840-b8c6-3a86bc106059-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-s5jwl\" (UID: \"424d4c27-d7a2-4840-b8c6-3a86bc106059\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" Apr 24 21:49:53.432083 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:53.432017 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" Apr 24 21:49:53.548365 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:53.548337 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl"] Apr 24 21:49:53.551342 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:49:53.551315 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod424d4c27_d7a2_4840_b8c6_3a86bc106059.slice/crio-54f93053c274d72b5680dea2919fd613efc08fe5192a76de1adb83bfcaf2409e WatchSource:0}: Error finding container 54f93053c274d72b5680dea2919fd613efc08fe5192a76de1adb83bfcaf2409e: Status 404 returned error can't find the container with id 54f93053c274d72b5680dea2919fd613efc08fe5192a76de1adb83bfcaf2409e Apr 24 21:49:53.949577 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:53.949541 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" event={"ID":"424d4c27-d7a2-4840-b8c6-3a86bc106059","Type":"ContainerStarted","Data":"2e0f5f29e8ca16aa40a34adfa16f8b232b02bd5b1698ba2f2cda5b2d4c6f4ada"} Apr 24 21:49:53.949577 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:53.949582 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" event={"ID":"424d4c27-d7a2-4840-b8c6-3a86bc106059","Type":"ContainerStarted","Data":"54f93053c274d72b5680dea2919fd613efc08fe5192a76de1adb83bfcaf2409e"} Apr 24 21:49:54.954565 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:54.954538 2570 generic.go:358] "Generic (PLEG): container finished" podID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerID="73baf8f98470b3c31c5445855cd9d41b62f80526af0e3d2d18d9107021470922" exitCode=0 Apr 24 21:49:54.954910 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:54.954616 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" event={"ID":"578f12f9-dc65-478d-bdaa-ce53f23dfda6","Type":"ContainerDied","Data":"73baf8f98470b3c31c5445855cd9d41b62f80526af0e3d2d18d9107021470922"} Apr 24 21:49:56.752373 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:56.752334 2570 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.30:8643/healthz\": dial tcp 10.133.0.30:8643: connect: connection refused" Apr 24 21:49:56.756718 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:56.756687 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.30:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.30:8080: connect: connection refused" Apr 24 21:49:58.967259 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:58.967226 2570 generic.go:358] "Generic (PLEG): container finished" podID="424d4c27-d7a2-4840-b8c6-3a86bc106059" containerID="2e0f5f29e8ca16aa40a34adfa16f8b232b02bd5b1698ba2f2cda5b2d4c6f4ada" exitCode=0 Apr 24 21:49:58.967635 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:49:58.967285 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" event={"ID":"424d4c27-d7a2-4840-b8c6-3a86bc106059","Type":"ContainerDied","Data":"2e0f5f29e8ca16aa40a34adfa16f8b232b02bd5b1698ba2f2cda5b2d4c6f4ada"} Apr 24 21:50:01.751365 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:01.751325 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.30:8643/healthz\": dial tcp 10.133.0.30:8643: connect: connection refused" Apr 24 21:50:06.751905 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:06.751853 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.30:8643/healthz\": dial tcp 10.133.0.30:8643: connect: connection refused" Apr 24 21:50:06.752377 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:06.752015 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:50:06.757390 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:06.757362 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.30:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.30:8080: connect: connection refused" Apr 24 21:50:10.005274 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:10.005236 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" event={"ID":"424d4c27-d7a2-4840-b8c6-3a86bc106059","Type":"ContainerStarted","Data":"f01baca0826cb1e587aa14e980f32d5c3a619be9ce602c4918bc5c8da28ab59b"} Apr 24 21:50:10.005274 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:10.005271 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" event={"ID":"424d4c27-d7a2-4840-b8c6-3a86bc106059","Type":"ContainerStarted","Data":"7ba795ecb69a45e563671a842f6439f1485708c03693b59fe3b158cda5b692c4"} Apr 
24 21:50:10.005768 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:10.005610 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" Apr 24 21:50:10.005768 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:10.005742 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" Apr 24 21:50:10.006886 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:10.006863 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" podUID="424d4c27-d7a2-4840-b8c6-3a86bc106059" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 21:50:10.024471 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:10.024431 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" podStartSLOduration=7.715267953 podStartE2EDuration="18.024419276s" podCreationTimestamp="2026-04-24 21:49:52 +0000 UTC" firstStartedPulling="2026-04-24 21:49:58.968330292 +0000 UTC m=+1351.391837676" lastFinishedPulling="2026-04-24 21:50:09.277481615 +0000 UTC m=+1361.700988999" observedRunningTime="2026-04-24 21:50:10.023581173 +0000 UTC m=+1362.447088592" watchObservedRunningTime="2026-04-24 21:50:10.024419276 +0000 UTC m=+1362.447926682" Apr 24 21:50:11.008503 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:11.008464 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" podUID="424d4c27-d7a2-4840-b8c6-3a86bc106059" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 21:50:11.751667 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:11.751630 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.30:8643/healthz\": dial tcp 10.133.0.30:8643: connect: connection refused" Apr 24 21:50:12.011736 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:12.011640 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" podUID="424d4c27-d7a2-4840-b8c6-3a86bc106059" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 21:50:16.751972 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:16.751930 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.30:8643/healthz\": dial tcp 10.133.0.30:8643: connect: connection refused" Apr 24 21:50:16.757358 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:16.757327 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.30:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.30:8080: connect: connection refused" Apr 24 21:50:16.757441 ip-10-0-129-230 kubenswrapper[2570]: I0424 
21:50:16.757424 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:50:17.016454 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:17.016371 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" Apr 24 21:50:17.016886 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:17.016858 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" podUID="424d4c27-d7a2-4840-b8c6-3a86bc106059" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 21:50:21.751662 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:21.751621 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.30:8643/healthz\": dial tcp 10.133.0.30:8643: connect: connection refused" Apr 24 21:50:22.580136 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:22.580113 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:50:22.668119 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:22.668082 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/578f12f9-dc65-478d-bdaa-ce53f23dfda6-proxy-tls\") pod \"578f12f9-dc65-478d-bdaa-ce53f23dfda6\" (UID: \"578f12f9-dc65-478d-bdaa-ce53f23dfda6\") " Apr 24 21:50:22.668276 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:22.668138 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vwgd\" (UniqueName: \"kubernetes.io/projected/578f12f9-dc65-478d-bdaa-ce53f23dfda6-kube-api-access-5vwgd\") pod \"578f12f9-dc65-478d-bdaa-ce53f23dfda6\" (UID: \"578f12f9-dc65-478d-bdaa-ce53f23dfda6\") " Apr 24 21:50:22.668276 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:22.668186 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/578f12f9-dc65-478d-bdaa-ce53f23dfda6-kserve-provision-location\") pod \"578f12f9-dc65-478d-bdaa-ce53f23dfda6\" (UID: \"578f12f9-dc65-478d-bdaa-ce53f23dfda6\") " Apr 24 21:50:22.668276 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:22.668230 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/578f12f9-dc65-478d-bdaa-ce53f23dfda6-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"578f12f9-dc65-478d-bdaa-ce53f23dfda6\" (UID: \"578f12f9-dc65-478d-bdaa-ce53f23dfda6\") " Apr 24 21:50:22.668551 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:22.668525 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/578f12f9-dc65-478d-bdaa-ce53f23dfda6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "578f12f9-dc65-478d-bdaa-ce53f23dfda6" (UID: "578f12f9-dc65-478d-bdaa-ce53f23dfda6"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:22.668619 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:22.668590 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/578f12f9-dc65-478d-bdaa-ce53f23dfda6-isvc-sklearn-mcp-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-mcp-kube-rbac-proxy-sar-config") pod "578f12f9-dc65-478d-bdaa-ce53f23dfda6" (UID: "578f12f9-dc65-478d-bdaa-ce53f23dfda6"). InnerVolumeSpecName "isvc-sklearn-mcp-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:50:22.670257 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:22.670235 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/578f12f9-dc65-478d-bdaa-ce53f23dfda6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "578f12f9-dc65-478d-bdaa-ce53f23dfda6" (UID: "578f12f9-dc65-478d-bdaa-ce53f23dfda6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:50:22.670360 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:22.670322 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/578f12f9-dc65-478d-bdaa-ce53f23dfda6-kube-api-access-5vwgd" (OuterVolumeSpecName: "kube-api-access-5vwgd") pod "578f12f9-dc65-478d-bdaa-ce53f23dfda6" (UID: "578f12f9-dc65-478d-bdaa-ce53f23dfda6"). InnerVolumeSpecName "kube-api-access-5vwgd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:50:22.768987 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:22.768932 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/578f12f9-dc65-478d-bdaa-ce53f23dfda6-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:50:22.768987 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:22.768953 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/578f12f9-dc65-478d-bdaa-ce53f23dfda6-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:50:22.768987 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:22.768965 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/578f12f9-dc65-478d-bdaa-ce53f23dfda6-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:50:22.768987 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:22.768974 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5vwgd\" (UniqueName: \"kubernetes.io/projected/578f12f9-dc65-478d-bdaa-ce53f23dfda6-kube-api-access-5vwgd\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:50:23.043501 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:23.043422 2570 generic.go:358] "Generic (PLEG): container finished" podID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerID="6ae6711fcbab86209b8d9fb62f3f1cfaa099de90fb66d5586c0af46439531378" exitCode=0 Apr 24 21:50:23.043501 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:23.043481 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" event={"ID":"578f12f9-dc65-478d-bdaa-ce53f23dfda6","Type":"ContainerDied","Data":"6ae6711fcbab86209b8d9fb62f3f1cfaa099de90fb66d5586c0af46439531378"} Apr 24 21:50:23.043685 ip-10-0-129-230 kubenswrapper[2570]: I0424 
21:50:23.043513 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" event={"ID":"578f12f9-dc65-478d-bdaa-ce53f23dfda6","Type":"ContainerDied","Data":"e79050571ee3980f97d5e8cb510a906ce0628a252246cdba95969a760875d200"} Apr 24 21:50:23.043685 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:23.043534 2570 scope.go:117] "RemoveContainer" containerID="a12c912fc7123387b7298c5e81c11a07831d980746de499e5903dd1233090317" Apr 24 21:50:23.043685 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:23.043556 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh" Apr 24 21:50:23.051573 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:23.051552 2570 scope.go:117] "RemoveContainer" containerID="6ae6711fcbab86209b8d9fb62f3f1cfaa099de90fb66d5586c0af46439531378" Apr 24 21:50:23.058685 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:23.058668 2570 scope.go:117] "RemoveContainer" containerID="73baf8f98470b3c31c5445855cd9d41b62f80526af0e3d2d18d9107021470922" Apr 24 21:50:23.064674 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:23.064652 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh"] Apr 24 21:50:23.065566 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:23.065544 2570 scope.go:117] "RemoveContainer" containerID="2d490d2fec4f2a94b7022a3eecada006f0e69594eb71840e2cb11f967f63dcbd" Apr 24 21:50:23.069421 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:23.069398 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57d594db69-zp9mh"] Apr 24 21:50:23.072377 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:23.072357 2570 scope.go:117] "RemoveContainer" containerID="a12c912fc7123387b7298c5e81c11a07831d980746de499e5903dd1233090317" Apr 24 21:50:23.072611 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:50:23.072594 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a12c912fc7123387b7298c5e81c11a07831d980746de499e5903dd1233090317\": container with ID starting with a12c912fc7123387b7298c5e81c11a07831d980746de499e5903dd1233090317 not found: ID does not exist" containerID="a12c912fc7123387b7298c5e81c11a07831d980746de499e5903dd1233090317" Apr 24 21:50:23.072652 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:23.072619 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a12c912fc7123387b7298c5e81c11a07831d980746de499e5903dd1233090317"} err="failed to get container status \"a12c912fc7123387b7298c5e81c11a07831d980746de499e5903dd1233090317\": rpc error: code = NotFound desc = could not find container \"a12c912fc7123387b7298c5e81c11a07831d980746de499e5903dd1233090317\": container with ID starting with a12c912fc7123387b7298c5e81c11a07831d980746de499e5903dd1233090317 not found: ID does not exist" Apr 24 21:50:23.072652 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:23.072635 2570 scope.go:117] "RemoveContainer" containerID="6ae6711fcbab86209b8d9fb62f3f1cfaa099de90fb66d5586c0af46439531378" Apr 24 21:50:23.072848 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:50:23.072832 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ae6711fcbab86209b8d9fb62f3f1cfaa099de90fb66d5586c0af46439531378\": container with ID starting with 
6ae6711fcbab86209b8d9fb62f3f1cfaa099de90fb66d5586c0af46439531378 not found: ID does not exist" containerID="6ae6711fcbab86209b8d9fb62f3f1cfaa099de90fb66d5586c0af46439531378" Apr 24 21:50:23.072893 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:23.072851 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ae6711fcbab86209b8d9fb62f3f1cfaa099de90fb66d5586c0af46439531378"} err="failed to get container status \"6ae6711fcbab86209b8d9fb62f3f1cfaa099de90fb66d5586c0af46439531378\": rpc error: code = NotFound desc = could not find container \"6ae6711fcbab86209b8d9fb62f3f1cfaa099de90fb66d5586c0af46439531378\": container with ID starting with 6ae6711fcbab86209b8d9fb62f3f1cfaa099de90fb66d5586c0af46439531378 not found: ID does not exist" Apr 24 21:50:23.072893 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:23.072865 2570 scope.go:117] "RemoveContainer" containerID="73baf8f98470b3c31c5445855cd9d41b62f80526af0e3d2d18d9107021470922" Apr 24 21:50:23.073064 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:50:23.073046 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73baf8f98470b3c31c5445855cd9d41b62f80526af0e3d2d18d9107021470922\": container with ID starting with 73baf8f98470b3c31c5445855cd9d41b62f80526af0e3d2d18d9107021470922 not found: ID does not exist" containerID="73baf8f98470b3c31c5445855cd9d41b62f80526af0e3d2d18d9107021470922" Apr 24 21:50:23.073111 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:23.073072 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73baf8f98470b3c31c5445855cd9d41b62f80526af0e3d2d18d9107021470922"} err="failed to get container status \"73baf8f98470b3c31c5445855cd9d41b62f80526af0e3d2d18d9107021470922\": rpc error: code = NotFound desc = could not find container \"73baf8f98470b3c31c5445855cd9d41b62f80526af0e3d2d18d9107021470922\": container with ID starting with 73baf8f98470b3c31c5445855cd9d41b62f80526af0e3d2d18d9107021470922 not found: ID does not exist" Apr 24 21:50:23.073111 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:23.073089 2570 scope.go:117] "RemoveContainer" containerID="2d490d2fec4f2a94b7022a3eecada006f0e69594eb71840e2cb11f967f63dcbd" Apr 24 21:50:23.073306 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:50:23.073283 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d490d2fec4f2a94b7022a3eecada006f0e69594eb71840e2cb11f967f63dcbd\": container with ID starting with 2d490d2fec4f2a94b7022a3eecada006f0e69594eb71840e2cb11f967f63dcbd not found: ID does not exist" containerID="2d490d2fec4f2a94b7022a3eecada006f0e69594eb71840e2cb11f967f63dcbd" Apr 24 21:50:23.073344 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:23.073311 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d490d2fec4f2a94b7022a3eecada006f0e69594eb71840e2cb11f967f63dcbd"} err="failed to get container status \"2d490d2fec4f2a94b7022a3eecada006f0e69594eb71840e2cb11f967f63dcbd\": rpc error: code = NotFound desc = could not find container \"2d490d2fec4f2a94b7022a3eecada006f0e69594eb71840e2cb11f967f63dcbd\": container with ID starting with 2d490d2fec4f2a94b7022a3eecada006f0e69594eb71840e2cb11f967f63dcbd not found: ID does not exist" Apr 24 21:50:24.066138 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:24.066108 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" path="/var/lib/kubelet/pods/578f12f9-dc65-478d-bdaa-ce53f23dfda6/volumes" Apr 24 21:50:27.017067 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:27.017005 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" podUID="424d4c27-d7a2-4840-b8c6-3a86bc106059" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 21:50:37.017420 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:37.017377 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" podUID="424d4c27-d7a2-4840-b8c6-3a86bc106059" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 21:50:47.017670 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:47.017633 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" podUID="424d4c27-d7a2-4840-b8c6-3a86bc106059" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 21:50:57.018128 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:50:57.018099 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" Apr 24 21:51:03.993405 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:03.993367 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl"] Apr 24 21:51:03.993886 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:03.993769 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" podUID="424d4c27-d7a2-4840-b8c6-3a86bc106059" containerName="kserve-container" containerID="cri-o://7ba795ecb69a45e563671a842f6439f1485708c03693b59fe3b158cda5b692c4" gracePeriod=30 Apr 24 21:51:03.993886 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:03.993780 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" podUID="424d4c27-d7a2-4840-b8c6-3a86bc106059" containerName="kube-rbac-proxy" containerID="cri-o://f01baca0826cb1e587aa14e980f32d5c3a619be9ce602c4918bc5c8da28ab59b" gracePeriod=30 Apr 24 21:51:04.121224 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.121193 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7"] Apr 24 21:51:04.121537 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.121521 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerName="kserve-agent" Apr 24 21:51:04.121614 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.121540 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerName="kserve-agent" Apr 24 21:51:04.121614 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.121552 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerName="kube-rbac-proxy" Apr 24 21:51:04.121614 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.121561 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerName="kube-rbac-proxy" Apr 24 
21:51:04.121614 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.121575 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerName="kserve-container" Apr 24 21:51:04.121614 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.121584 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerName="kserve-container" Apr 24 21:51:04.121614 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.121600 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerName="storage-initializer" Apr 24 21:51:04.121614 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.121608 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerName="storage-initializer" Apr 24 21:51:04.121948 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.121683 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerName="kserve-container" Apr 24 21:51:04.121948 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.121695 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerName="kserve-agent" Apr 24 21:51:04.121948 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.121708 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="578f12f9-dc65-478d-bdaa-ce53f23dfda6" containerName="kube-rbac-proxy" Apr 24 21:51:04.124603 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.124580 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" Apr 24 21:51:04.126648 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.126628 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-predictor-serving-cert\"" Apr 24 21:51:04.126763 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.126669 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-kube-rbac-proxy-sar-config\"" Apr 24 21:51:04.133857 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.133834 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7"] Apr 24 21:51:04.161797 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.161748 2570 generic.go:358] "Generic (PLEG): container finished" podID="424d4c27-d7a2-4840-b8c6-3a86bc106059" containerID="f01baca0826cb1e587aa14e980f32d5c3a619be9ce602c4918bc5c8da28ab59b" exitCode=2 Apr 24 21:51:04.161966 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.161811 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" event={"ID":"424d4c27-d7a2-4840-b8c6-3a86bc106059","Type":"ContainerDied","Data":"f01baca0826cb1e587aa14e980f32d5c3a619be9ce602c4918bc5c8da28ab59b"} Apr 24 21:51:04.246457 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.246359 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8ms8\" (UniqueName: \"kubernetes.io/projected/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-kube-api-access-t8ms8\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7\" (UID: \"c91f974e-bcaa-4aff-9b0f-35fee97da7f0\") " 
pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" Apr 24 21:51:04.246457 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.246409 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7\" (UID: \"c91f974e-bcaa-4aff-9b0f-35fee97da7f0\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" Apr 24 21:51:04.246457 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.246434 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7\" (UID: \"c91f974e-bcaa-4aff-9b0f-35fee97da7f0\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" Apr 24 21:51:04.246698 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.246533 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7\" (UID: \"c91f974e-bcaa-4aff-9b0f-35fee97da7f0\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" Apr 24 21:51:04.347476 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.347444 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7\" (UID: \"c91f974e-bcaa-4aff-9b0f-35fee97da7f0\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" Apr 24 21:51:04.347629 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.347497 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8ms8\" (UniqueName: \"kubernetes.io/projected/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-kube-api-access-t8ms8\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7\" (UID: \"c91f974e-bcaa-4aff-9b0f-35fee97da7f0\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" Apr 24 21:51:04.347629 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.347544 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7\" (UID: \"c91f974e-bcaa-4aff-9b0f-35fee97da7f0\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" Apr 24 21:51:04.347629 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.347580 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7\" (UID: \"c91f974e-bcaa-4aff-9b0f-35fee97da7f0\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" Apr 24 21:51:04.347876 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.347853 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7\" (UID: \"c91f974e-bcaa-4aff-9b0f-35fee97da7f0\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" Apr 24 21:51:04.348311 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.348219 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7\" (UID: \"c91f974e-bcaa-4aff-9b0f-35fee97da7f0\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" Apr 24 21:51:04.350049 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.350006 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7\" (UID: \"c91f974e-bcaa-4aff-9b0f-35fee97da7f0\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" Apr 24 21:51:04.355683 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.355658 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8ms8\" (UniqueName: \"kubernetes.io/projected/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-kube-api-access-t8ms8\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7\" (UID: \"c91f974e-bcaa-4aff-9b0f-35fee97da7f0\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" Apr 24 21:51:04.435946 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.435914 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" Apr 24 21:51:04.556755 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:04.556726 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7"] Apr 24 21:51:04.560199 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:51:04.560164 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc91f974e_bcaa_4aff_9b0f_35fee97da7f0.slice/crio-9b084f3c5522e67b7949a5e77baa32c0528477dc26d4b8aab4ba7c2b51899030 WatchSource:0}: Error finding container 9b084f3c5522e67b7949a5e77baa32c0528477dc26d4b8aab4ba7c2b51899030: Status 404 returned error can't find the container with id 9b084f3c5522e67b7949a5e77baa32c0528477dc26d4b8aab4ba7c2b51899030 Apr 24 21:51:05.166528 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:05.166488 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" event={"ID":"c91f974e-bcaa-4aff-9b0f-35fee97da7f0","Type":"ContainerStarted","Data":"94bf3869eaac377f4aa0535389fa4626ed8fd2db15fedb94876887a9fb536d2e"} Apr 24 21:51:05.166528 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:05.166531 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" event={"ID":"c91f974e-bcaa-4aff-9b0f-35fee97da7f0","Type":"ContainerStarted","Data":"9b084f3c5522e67b7949a5e77baa32c0528477dc26d4b8aab4ba7c2b51899030"} Apr 24 21:51:06.624281 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:06.624260 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" Apr 24 21:51:06.664467 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:06.664430 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/424d4c27-d7a2-4840-b8c6-3a86bc106059-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"424d4c27-d7a2-4840-b8c6-3a86bc106059\" (UID: \"424d4c27-d7a2-4840-b8c6-3a86bc106059\") " Apr 24 21:51:06.664601 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:06.664506 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6h8tt\" (UniqueName: \"kubernetes.io/projected/424d4c27-d7a2-4840-b8c6-3a86bc106059-kube-api-access-6h8tt\") pod \"424d4c27-d7a2-4840-b8c6-3a86bc106059\" (UID: \"424d4c27-d7a2-4840-b8c6-3a86bc106059\") " Apr 24 21:51:06.664601 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:06.664545 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/424d4c27-d7a2-4840-b8c6-3a86bc106059-proxy-tls\") pod \"424d4c27-d7a2-4840-b8c6-3a86bc106059\" (UID: \"424d4c27-d7a2-4840-b8c6-3a86bc106059\") " Apr 24 21:51:06.664601 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:06.664588 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/424d4c27-d7a2-4840-b8c6-3a86bc106059-kserve-provision-location\") pod \"424d4c27-d7a2-4840-b8c6-3a86bc106059\" (UID: \"424d4c27-d7a2-4840-b8c6-3a86bc106059\") " Apr 24 21:51:06.664795 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:06.664772 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/424d4c27-d7a2-4840-b8c6-3a86bc106059-isvc-paddle-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-kube-rbac-proxy-sar-config") pod "424d4c27-d7a2-4840-b8c6-3a86bc106059" (UID: "424d4c27-d7a2-4840-b8c6-3a86bc106059"). InnerVolumeSpecName "isvc-paddle-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:51:06.666796 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:06.666768 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/424d4c27-d7a2-4840-b8c6-3a86bc106059-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "424d4c27-d7a2-4840-b8c6-3a86bc106059" (UID: "424d4c27-d7a2-4840-b8c6-3a86bc106059"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:51:06.666897 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:06.666806 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/424d4c27-d7a2-4840-b8c6-3a86bc106059-kube-api-access-6h8tt" (OuterVolumeSpecName: "kube-api-access-6h8tt") pod "424d4c27-d7a2-4840-b8c6-3a86bc106059" (UID: "424d4c27-d7a2-4840-b8c6-3a86bc106059"). InnerVolumeSpecName "kube-api-access-6h8tt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:51:06.674357 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:06.674334 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/424d4c27-d7a2-4840-b8c6-3a86bc106059-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "424d4c27-d7a2-4840-b8c6-3a86bc106059" (UID: "424d4c27-d7a2-4840-b8c6-3a86bc106059"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:51:06.765194 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:06.765095 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/424d4c27-d7a2-4840-b8c6-3a86bc106059-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:51:06.765194 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:06.765130 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/424d4c27-d7a2-4840-b8c6-3a86bc106059-isvc-paddle-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:51:06.765194 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:06.765144 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6h8tt\" (UniqueName: \"kubernetes.io/projected/424d4c27-d7a2-4840-b8c6-3a86bc106059-kube-api-access-6h8tt\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:51:06.765194 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:06.765156 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/424d4c27-d7a2-4840-b8c6-3a86bc106059-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:51:07.173681 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:07.173644 2570 generic.go:358] "Generic (PLEG): container finished" podID="424d4c27-d7a2-4840-b8c6-3a86bc106059" containerID="7ba795ecb69a45e563671a842f6439f1485708c03693b59fe3b158cda5b692c4" exitCode=0 Apr 24 21:51:07.173850 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:07.173720 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" event={"ID":"424d4c27-d7a2-4840-b8c6-3a86bc106059","Type":"ContainerDied","Data":"7ba795ecb69a45e563671a842f6439f1485708c03693b59fe3b158cda5b692c4"} Apr 24 21:51:07.173850 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:07.173738 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" Apr 24 21:51:07.173850 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:07.173753 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl" event={"ID":"424d4c27-d7a2-4840-b8c6-3a86bc106059","Type":"ContainerDied","Data":"54f93053c274d72b5680dea2919fd613efc08fe5192a76de1adb83bfcaf2409e"} Apr 24 21:51:07.173850 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:07.173774 2570 scope.go:117] "RemoveContainer" containerID="f01baca0826cb1e587aa14e980f32d5c3a619be9ce602c4918bc5c8da28ab59b" Apr 24 21:51:07.182298 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:07.182280 2570 scope.go:117] "RemoveContainer" containerID="7ba795ecb69a45e563671a842f6439f1485708c03693b59fe3b158cda5b692c4" Apr 24 21:51:07.189356 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:07.189338 2570 scope.go:117] "RemoveContainer" containerID="2e0f5f29e8ca16aa40a34adfa16f8b232b02bd5b1698ba2f2cda5b2d4c6f4ada" Apr 24 21:51:07.195686 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:07.195659 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl"] Apr 24 21:51:07.197119 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:07.197108 2570 scope.go:117] "RemoveContainer" containerID="f01baca0826cb1e587aa14e980f32d5c3a619be9ce602c4918bc5c8da28ab59b" Apr 24 21:51:07.197396 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:51:07.197378 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f01baca0826cb1e587aa14e980f32d5c3a619be9ce602c4918bc5c8da28ab59b\": container with ID starting with f01baca0826cb1e587aa14e980f32d5c3a619be9ce602c4918bc5c8da28ab59b not found: ID does not exist" containerID="f01baca0826cb1e587aa14e980f32d5c3a619be9ce602c4918bc5c8da28ab59b" Apr 24 21:51:07.197445 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:07.197405 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01baca0826cb1e587aa14e980f32d5c3a619be9ce602c4918bc5c8da28ab59b"} err="failed to get container status \"f01baca0826cb1e587aa14e980f32d5c3a619be9ce602c4918bc5c8da28ab59b\": rpc error: code = NotFound desc = could not find container \"f01baca0826cb1e587aa14e980f32d5c3a619be9ce602c4918bc5c8da28ab59b\": container with ID starting with f01baca0826cb1e587aa14e980f32d5c3a619be9ce602c4918bc5c8da28ab59b not found: ID does not exist" Apr 24 21:51:07.197445 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:07.197422 2570 scope.go:117] "RemoveContainer" containerID="7ba795ecb69a45e563671a842f6439f1485708c03693b59fe3b158cda5b692c4" Apr 24 21:51:07.197647 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:51:07.197630 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ba795ecb69a45e563671a842f6439f1485708c03693b59fe3b158cda5b692c4\": container with ID starting with 7ba795ecb69a45e563671a842f6439f1485708c03693b59fe3b158cda5b692c4 not found: ID does not exist" 
containerID="7ba795ecb69a45e563671a842f6439f1485708c03693b59fe3b158cda5b692c4" Apr 24 21:51:07.197708 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:07.197658 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ba795ecb69a45e563671a842f6439f1485708c03693b59fe3b158cda5b692c4"} err="failed to get container status \"7ba795ecb69a45e563671a842f6439f1485708c03693b59fe3b158cda5b692c4\": rpc error: code = NotFound desc = could not find container \"7ba795ecb69a45e563671a842f6439f1485708c03693b59fe3b158cda5b692c4\": container with ID starting with 7ba795ecb69a45e563671a842f6439f1485708c03693b59fe3b158cda5b692c4 not found: ID does not exist" Apr 24 21:51:07.197708 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:07.197684 2570 scope.go:117] "RemoveContainer" containerID="2e0f5f29e8ca16aa40a34adfa16f8b232b02bd5b1698ba2f2cda5b2d4c6f4ada" Apr 24 21:51:07.197932 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:51:07.197918 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e0f5f29e8ca16aa40a34adfa16f8b232b02bd5b1698ba2f2cda5b2d4c6f4ada\": container with ID starting with 2e0f5f29e8ca16aa40a34adfa16f8b232b02bd5b1698ba2f2cda5b2d4c6f4ada not found: ID does not exist" containerID="2e0f5f29e8ca16aa40a34adfa16f8b232b02bd5b1698ba2f2cda5b2d4c6f4ada" Apr 24 21:51:07.197983 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:07.197934 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e0f5f29e8ca16aa40a34adfa16f8b232b02bd5b1698ba2f2cda5b2d4c6f4ada"} err="failed to get container status \"2e0f5f29e8ca16aa40a34adfa16f8b232b02bd5b1698ba2f2cda5b2d4c6f4ada\": rpc error: code = NotFound desc = could not find container \"2e0f5f29e8ca16aa40a34adfa16f8b232b02bd5b1698ba2f2cda5b2d4c6f4ada\": container with ID starting with 2e0f5f29e8ca16aa40a34adfa16f8b232b02bd5b1698ba2f2cda5b2d4c6f4ada not found: ID does not exist" Apr 24 21:51:07.200939 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:07.200920 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-s5jwl"] Apr 24 21:51:08.067041 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:08.066995 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="424d4c27-d7a2-4840-b8c6-3a86bc106059" path="/var/lib/kubelet/pods/424d4c27-d7a2-4840-b8c6-3a86bc106059/volumes" Apr 24 21:51:09.181098 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:09.181066 2570 generic.go:358] "Generic (PLEG): container finished" podID="c91f974e-bcaa-4aff-9b0f-35fee97da7f0" containerID="94bf3869eaac377f4aa0535389fa4626ed8fd2db15fedb94876887a9fb536d2e" exitCode=0 Apr 24 21:51:09.181504 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:09.181109 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" event={"ID":"c91f974e-bcaa-4aff-9b0f-35fee97da7f0","Type":"ContainerDied","Data":"94bf3869eaac377f4aa0535389fa4626ed8fd2db15fedb94876887a9fb536d2e"} Apr 24 21:51:10.186302 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:10.186268 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" event={"ID":"c91f974e-bcaa-4aff-9b0f-35fee97da7f0","Type":"ContainerStarted","Data":"6991dfea52d7e055095176bc385c5f30c7fabb78b11a665e7bfe58d6ae89abda"} Apr 24 21:51:10.186302 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:10.186302 2570 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" event={"ID":"c91f974e-bcaa-4aff-9b0f-35fee97da7f0","Type":"ContainerStarted","Data":"c7828a8586da8ebf4676e891e2e356fc160c228f8eacc83ab2c528b552bc8451"} Apr 24 21:51:10.186900 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:10.186625 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" Apr 24 21:51:10.186900 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:10.186659 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" Apr 24 21:51:10.187875 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:10.187851 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" podUID="c91f974e-bcaa-4aff-9b0f-35fee97da7f0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 24 21:51:10.207314 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:10.207262 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" podStartSLOduration=6.207243664 podStartE2EDuration="6.207243664s" podCreationTimestamp="2026-04-24 21:51:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:51:10.206516311 +0000 UTC m=+1422.630023717" watchObservedRunningTime="2026-04-24 21:51:10.207243664 +0000 UTC m=+1422.630751085" Apr 24 21:51:11.190314 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:11.190231 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" podUID="c91f974e-bcaa-4aff-9b0f-35fee97da7f0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 24 21:51:16.194725 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:16.194694 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" Apr 24 21:51:16.195306 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:16.195281 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" podUID="c91f974e-bcaa-4aff-9b0f-35fee97da7f0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 24 21:51:26.195616 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:26.195574 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" podUID="c91f974e-bcaa-4aff-9b0f-35fee97da7f0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 24 21:51:36.196017 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:36.195975 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" podUID="c91f974e-bcaa-4aff-9b0f-35fee97da7f0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 24 21:51:46.195705 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:46.195666 2570 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" podUID="c91f974e-bcaa-4aff-9b0f-35fee97da7f0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 24 21:51:56.196669 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:51:56.196640 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" Apr 24 21:52:05.617739 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:05.617704 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7"] Apr 24 21:52:05.618315 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:05.618130 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" podUID="c91f974e-bcaa-4aff-9b0f-35fee97da7f0" containerName="kserve-container" containerID="cri-o://c7828a8586da8ebf4676e891e2e356fc160c228f8eacc83ab2c528b552bc8451" gracePeriod=30 Apr 24 21:52:05.618315 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:05.618157 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" podUID="c91f974e-bcaa-4aff-9b0f-35fee97da7f0" containerName="kube-rbac-proxy" containerID="cri-o://6991dfea52d7e055095176bc385c5f30c7fabb78b11a665e7bfe58d6ae89abda" gracePeriod=30 Apr 24 21:52:05.850854 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:05.850820 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq"] Apr 24 21:52:05.851140 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:05.851128 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="424d4c27-d7a2-4840-b8c6-3a86bc106059" containerName="kserve-container" Apr 24 21:52:05.851199 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:05.851142 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="424d4c27-d7a2-4840-b8c6-3a86bc106059" containerName="kserve-container" Apr 24 21:52:05.851199 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:05.851154 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="424d4c27-d7a2-4840-b8c6-3a86bc106059" containerName="storage-initializer" Apr 24 21:52:05.851199 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:05.851159 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="424d4c27-d7a2-4840-b8c6-3a86bc106059" containerName="storage-initializer" Apr 24 21:52:05.851199 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:05.851166 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="424d4c27-d7a2-4840-b8c6-3a86bc106059" containerName="kube-rbac-proxy" Apr 24 21:52:05.851199 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:05.851172 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="424d4c27-d7a2-4840-b8c6-3a86bc106059" containerName="kube-rbac-proxy" Apr 24 21:52:05.851348 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:05.851214 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="424d4c27-d7a2-4840-b8c6-3a86bc106059" containerName="kube-rbac-proxy" Apr 24 21:52:05.851348 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:05.851223 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="424d4c27-d7a2-4840-b8c6-3a86bc106059" containerName="kserve-container" Apr 24 
21:52:05.854251 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:05.854236 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" Apr 24 21:52:05.856869 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:05.856846 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 24 21:52:05.857443 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:05.857422 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-predictor-serving-cert\"" Apr 24 21:52:05.869060 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:05.869000 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq"] Apr 24 21:52:05.963501 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:05.963473 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq\" (UID: \"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" Apr 24 21:52:05.963612 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:05.963505 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbf6h\" (UniqueName: \"kubernetes.io/projected/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-kube-api-access-qbf6h\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq\" (UID: \"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" Apr 24 21:52:05.963612 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:05.963552 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq\" (UID: \"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" Apr 24 21:52:05.963612 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:05.963578 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq\" (UID: \"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" Apr 24 21:52:06.063916 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:06.063884 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq\" (UID: \"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" Apr 24 21:52:06.063916 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:06.063915 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qbf6h\" (UniqueName: \"kubernetes.io/projected/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-kube-api-access-qbf6h\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq\" (UID: \"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" Apr 24 21:52:06.064118 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:06.063944 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq\" (UID: \"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" Apr 24 21:52:06.064118 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:06.064077 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq\" (UID: \"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" Apr 24 21:52:06.064468 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:06.064442 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq\" (UID: \"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" Apr 24 21:52:06.064718 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:06.064694 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq\" (UID: \"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" Apr 24 21:52:06.066838 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:06.066815 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq\" (UID: \"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" Apr 24 21:52:06.072597 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:06.072574 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbf6h\" (UniqueName: \"kubernetes.io/projected/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-kube-api-access-qbf6h\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq\" (UID: \"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" Apr 24 21:52:06.164095 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:06.164073 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" Apr 24 21:52:06.190968 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:06.190936 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" podUID="c91f974e-bcaa-4aff-9b0f-35fee97da7f0" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused" Apr 24 21:52:06.195274 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:06.195242 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" podUID="c91f974e-bcaa-4aff-9b0f-35fee97da7f0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 24 21:52:06.285223 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:06.285195 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq"] Apr 24 21:52:06.287590 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:52:06.287551 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfa5b575_2889_4bc1_88cf_4dfcc4e84ad0.slice/crio-579ff4bf902aaf2528c9f2d4fa18cc240e6fdf39ee2ae3844b86bf60aa8e1acb WatchSource:0}: Error finding container 579ff4bf902aaf2528c9f2d4fa18cc240e6fdf39ee2ae3844b86bf60aa8e1acb: Status 404 returned error can't find the container with id 579ff4bf902aaf2528c9f2d4fa18cc240e6fdf39ee2ae3844b86bf60aa8e1acb Apr 24 21:52:06.289356 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:06.289340 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:52:06.332692 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:06.332669 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" event={"ID":"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0","Type":"ContainerStarted","Data":"579ff4bf902aaf2528c9f2d4fa18cc240e6fdf39ee2ae3844b86bf60aa8e1acb"} Apr 24 21:52:06.334619 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:06.334596 2570 generic.go:358] "Generic (PLEG): container finished" podID="c91f974e-bcaa-4aff-9b0f-35fee97da7f0" containerID="6991dfea52d7e055095176bc385c5f30c7fabb78b11a665e7bfe58d6ae89abda" exitCode=2 Apr 24 21:52:06.334715 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:06.334670 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" event={"ID":"c91f974e-bcaa-4aff-9b0f-35fee97da7f0","Type":"ContainerDied","Data":"6991dfea52d7e055095176bc385c5f30c7fabb78b11a665e7bfe58d6ae89abda"} Apr 24 21:52:07.338954 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:07.338915 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" event={"ID":"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0","Type":"ContainerStarted","Data":"82ec32b2a44f49dd6655c201ac8a5ded0ebd60d0add8a4dfc92e4ae23b9bd3eb"} Apr 24 21:52:08.160736 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.160713 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" Apr 24 21:52:08.279467 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.279370 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8ms8\" (UniqueName: \"kubernetes.io/projected/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-kube-api-access-t8ms8\") pod \"c91f974e-bcaa-4aff-9b0f-35fee97da7f0\" (UID: \"c91f974e-bcaa-4aff-9b0f-35fee97da7f0\") " Apr 24 21:52:08.279467 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.279411 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-proxy-tls\") pod \"c91f974e-bcaa-4aff-9b0f-35fee97da7f0\" (UID: \"c91f974e-bcaa-4aff-9b0f-35fee97da7f0\") " Apr 24 21:52:08.279467 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.279454 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-kserve-provision-location\") pod \"c91f974e-bcaa-4aff-9b0f-35fee97da7f0\" (UID: \"c91f974e-bcaa-4aff-9b0f-35fee97da7f0\") " Apr 24 21:52:08.279745 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.279489 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"c91f974e-bcaa-4aff-9b0f-35fee97da7f0\" (UID: \"c91f974e-bcaa-4aff-9b0f-35fee97da7f0\") " Apr 24 21:52:08.279974 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.279946 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-isvc-paddle-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-runtime-kube-rbac-proxy-sar-config") pod "c91f974e-bcaa-4aff-9b0f-35fee97da7f0" (UID: "c91f974e-bcaa-4aff-9b0f-35fee97da7f0"). InnerVolumeSpecName "isvc-paddle-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:52:08.282119 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.282084 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c91f974e-bcaa-4aff-9b0f-35fee97da7f0" (UID: "c91f974e-bcaa-4aff-9b0f-35fee97da7f0"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:52:08.282204 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.282125 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-kube-api-access-t8ms8" (OuterVolumeSpecName: "kube-api-access-t8ms8") pod "c91f974e-bcaa-4aff-9b0f-35fee97da7f0" (UID: "c91f974e-bcaa-4aff-9b0f-35fee97da7f0"). InnerVolumeSpecName "kube-api-access-t8ms8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:52:08.291105 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.291075 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c91f974e-bcaa-4aff-9b0f-35fee97da7f0" (UID: "c91f974e-bcaa-4aff-9b0f-35fee97da7f0"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:52:08.343130 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.343101 2570 generic.go:358] "Generic (PLEG): container finished" podID="c91f974e-bcaa-4aff-9b0f-35fee97da7f0" containerID="c7828a8586da8ebf4676e891e2e356fc160c228f8eacc83ab2c528b552bc8451" exitCode=0 Apr 24 21:52:08.343458 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.343186 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" Apr 24 21:52:08.343458 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.343188 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" event={"ID":"c91f974e-bcaa-4aff-9b0f-35fee97da7f0","Type":"ContainerDied","Data":"c7828a8586da8ebf4676e891e2e356fc160c228f8eacc83ab2c528b552bc8451"} Apr 24 21:52:08.343458 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.343224 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7" event={"ID":"c91f974e-bcaa-4aff-9b0f-35fee97da7f0","Type":"ContainerDied","Data":"9b084f3c5522e67b7949a5e77baa32c0528477dc26d4b8aab4ba7c2b51899030"} Apr 24 21:52:08.343458 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.343240 2570 scope.go:117] "RemoveContainer" containerID="6991dfea52d7e055095176bc385c5f30c7fabb78b11a665e7bfe58d6ae89abda" Apr 24 21:52:08.351241 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.351000 2570 scope.go:117] "RemoveContainer" containerID="c7828a8586da8ebf4676e891e2e356fc160c228f8eacc83ab2c528b552bc8451" Apr 24 21:52:08.358157 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.358143 2570 scope.go:117] "RemoveContainer" containerID="94bf3869eaac377f4aa0535389fa4626ed8fd2db15fedb94876887a9fb536d2e" Apr 24 21:52:08.364706 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.364685 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7"] Apr 24 21:52:08.364991 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.364978 2570 scope.go:117] "RemoveContainer" containerID="6991dfea52d7e055095176bc385c5f30c7fabb78b11a665e7bfe58d6ae89abda" Apr 24 21:52:08.365274 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:52:08.365255 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6991dfea52d7e055095176bc385c5f30c7fabb78b11a665e7bfe58d6ae89abda\": container with ID starting with 6991dfea52d7e055095176bc385c5f30c7fabb78b11a665e7bfe58d6ae89abda not found: ID does not exist" containerID="6991dfea52d7e055095176bc385c5f30c7fabb78b11a665e7bfe58d6ae89abda" Apr 24 21:52:08.365338 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.365282 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6991dfea52d7e055095176bc385c5f30c7fabb78b11a665e7bfe58d6ae89abda"} err="failed to get container status \"6991dfea52d7e055095176bc385c5f30c7fabb78b11a665e7bfe58d6ae89abda\": rpc error: code = NotFound desc = could not find container \"6991dfea52d7e055095176bc385c5f30c7fabb78b11a665e7bfe58d6ae89abda\": container with ID starting with 6991dfea52d7e055095176bc385c5f30c7fabb78b11a665e7bfe58d6ae89abda not found: ID does not exist" Apr 24 21:52:08.365338 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.365300 2570 scope.go:117] "RemoveContainer" 
containerID="c7828a8586da8ebf4676e891e2e356fc160c228f8eacc83ab2c528b552bc8451" Apr 24 21:52:08.365524 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:52:08.365509 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7828a8586da8ebf4676e891e2e356fc160c228f8eacc83ab2c528b552bc8451\": container with ID starting with c7828a8586da8ebf4676e891e2e356fc160c228f8eacc83ab2c528b552bc8451 not found: ID does not exist" containerID="c7828a8586da8ebf4676e891e2e356fc160c228f8eacc83ab2c528b552bc8451" Apr 24 21:52:08.365574 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.365530 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7828a8586da8ebf4676e891e2e356fc160c228f8eacc83ab2c528b552bc8451"} err="failed to get container status \"c7828a8586da8ebf4676e891e2e356fc160c228f8eacc83ab2c528b552bc8451\": rpc error: code = NotFound desc = could not find container \"c7828a8586da8ebf4676e891e2e356fc160c228f8eacc83ab2c528b552bc8451\": container with ID starting with c7828a8586da8ebf4676e891e2e356fc160c228f8eacc83ab2c528b552bc8451 not found: ID does not exist" Apr 24 21:52:08.365574 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.365546 2570 scope.go:117] "RemoveContainer" containerID="94bf3869eaac377f4aa0535389fa4626ed8fd2db15fedb94876887a9fb536d2e" Apr 24 21:52:08.365714 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:52:08.365700 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94bf3869eaac377f4aa0535389fa4626ed8fd2db15fedb94876887a9fb536d2e\": container with ID starting with 94bf3869eaac377f4aa0535389fa4626ed8fd2db15fedb94876887a9fb536d2e not found: ID does not exist" containerID="94bf3869eaac377f4aa0535389fa4626ed8fd2db15fedb94876887a9fb536d2e" Apr 24 21:52:08.365791 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.365716 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94bf3869eaac377f4aa0535389fa4626ed8fd2db15fedb94876887a9fb536d2e"} err="failed to get container status \"94bf3869eaac377f4aa0535389fa4626ed8fd2db15fedb94876887a9fb536d2e\": rpc error: code = NotFound desc = could not find container \"94bf3869eaac377f4aa0535389fa4626ed8fd2db15fedb94876887a9fb536d2e\": container with ID starting with 94bf3869eaac377f4aa0535389fa4626ed8fd2db15fedb94876887a9fb536d2e not found: ID does not exist" Apr 24 21:52:08.371610 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.371588 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-n5xp7"] Apr 24 21:52:08.380656 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.380633 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:52:08.380656 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.380654 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t8ms8\" (UniqueName: \"kubernetes.io/projected/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-kube-api-access-t8ms8\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:52:08.380798 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.380664 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:52:08.380798 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:08.380674 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c91f974e-bcaa-4aff-9b0f-35fee97da7f0-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:52:10.066926 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:10.066891 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c91f974e-bcaa-4aff-9b0f-35fee97da7f0" path="/var/lib/kubelet/pods/c91f974e-bcaa-4aff-9b0f-35fee97da7f0/volumes" Apr 24 21:52:11.352707 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:11.352672 2570 generic.go:358] "Generic (PLEG): container finished" podID="dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" containerID="82ec32b2a44f49dd6655c201ac8a5ded0ebd60d0add8a4dfc92e4ae23b9bd3eb" exitCode=0 Apr 24 21:52:11.353101 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:11.352736 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" event={"ID":"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0","Type":"ContainerDied","Data":"82ec32b2a44f49dd6655c201ac8a5ded0ebd60d0add8a4dfc92e4ae23b9bd3eb"} Apr 24 21:52:12.357189 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:12.357144 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" event={"ID":"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0","Type":"ContainerStarted","Data":"25a3ec0cc26baeee2e283a69d379b61babe61f7b966d24676626de75d44d569c"} Apr 24 21:52:12.357189 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:12.357194 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" event={"ID":"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0","Type":"ContainerStarted","Data":"9800dbe2430e3ec0aaa6df0144af5e51ef09da343c595b05d2f85f756bd108ef"} Apr 24 21:52:12.357616 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:12.357485 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" Apr 24 21:52:12.357616 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:12.357609 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" Apr 24 21:52:12.358910 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:12.358883 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" podUID="dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 24 21:52:12.379953 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:12.379912 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" podStartSLOduration=7.379901766 podStartE2EDuration="7.379901766s" podCreationTimestamp="2026-04-24 21:52:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:52:12.378695022 +0000 UTC m=+1484.802202429" watchObservedRunningTime="2026-04-24 21:52:12.379901766 +0000 UTC m=+1484.803409173" 
Apr 24 21:52:13.360213 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:13.360167 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" podUID="dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 24 21:52:18.364749 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:18.364724 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" Apr 24 21:52:18.365389 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:18.365362 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" podUID="dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 24 21:52:28.064944 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:28.064915 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 21:52:28.068232 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:28.068207 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 21:52:28.365753 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:28.365670 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" podUID="dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 24 21:52:38.365948 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:38.365911 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" podUID="dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 24 21:52:48.366260 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:48.366184 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" podUID="dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 24 21:52:58.366503 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:52:58.366473 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" Apr 24 21:53:07.266819 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.266776 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq"] Apr 24 21:53:07.267291 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.267115 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" podUID="dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" containerName="kserve-container" containerID="cri-o://9800dbe2430e3ec0aaa6df0144af5e51ef09da343c595b05d2f85f756bd108ef" gracePeriod=30 Apr 24 21:53:07.267291 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.267186 2570 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" podUID="dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" containerName="kube-rbac-proxy" containerID="cri-o://25a3ec0cc26baeee2e283a69d379b61babe61f7b966d24676626de75d44d569c" gracePeriod=30 Apr 24 21:53:07.357322 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.357295 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7"] Apr 24 21:53:07.357570 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.357559 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c91f974e-bcaa-4aff-9b0f-35fee97da7f0" containerName="kube-rbac-proxy" Apr 24 21:53:07.357616 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.357572 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="c91f974e-bcaa-4aff-9b0f-35fee97da7f0" containerName="kube-rbac-proxy" Apr 24 21:53:07.357616 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.357582 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c91f974e-bcaa-4aff-9b0f-35fee97da7f0" containerName="kserve-container" Apr 24 21:53:07.357616 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.357588 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="c91f974e-bcaa-4aff-9b0f-35fee97da7f0" containerName="kserve-container" Apr 24 21:53:07.357616 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.357597 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c91f974e-bcaa-4aff-9b0f-35fee97da7f0" containerName="storage-initializer" Apr 24 21:53:07.357616 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.357603 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="c91f974e-bcaa-4aff-9b0f-35fee97da7f0" containerName="storage-initializer" Apr 24 21:53:07.357759 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.357658 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="c91f974e-bcaa-4aff-9b0f-35fee97da7f0" containerName="kube-rbac-proxy" Apr 24 21:53:07.357759 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.357667 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="c91f974e-bcaa-4aff-9b0f-35fee97da7f0" containerName="kserve-container" Apr 24 21:53:07.360739 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.360722 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" Apr 24 21:53:07.363278 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.363246 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-predictor-serving-cert\"" Apr 24 21:53:07.363278 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.363264 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-kube-rbac-proxy-sar-config\"" Apr 24 21:53:07.368791 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.368770 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7"] Apr 24 21:53:07.484725 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.484693 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df063b6c-c298-404c-b8ca-4b4cc846593f-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-4xvw7\" (UID: \"df063b6c-c298-404c-b8ca-4b4cc846593f\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" Apr 24 21:53:07.484843 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.484740 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df063b6c-c298-404c-b8ca-4b4cc846593f-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-4xvw7\" (UID: \"df063b6c-c298-404c-b8ca-4b4cc846593f\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" Apr 24 21:53:07.484843 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.484775 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl2km\" (UniqueName: \"kubernetes.io/projected/df063b6c-c298-404c-b8ca-4b4cc846593f-kube-api-access-tl2km\") pod \"isvc-pmml-predictor-8bb578669-4xvw7\" (UID: \"df063b6c-c298-404c-b8ca-4b4cc846593f\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" Apr 24 21:53:07.484843 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.484833 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df063b6c-c298-404c-b8ca-4b4cc846593f-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-4xvw7\" (UID: \"df063b6c-c298-404c-b8ca-4b4cc846593f\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" Apr 24 21:53:07.515516 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.515489 2570 generic.go:358] "Generic (PLEG): container finished" podID="dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" containerID="25a3ec0cc26baeee2e283a69d379b61babe61f7b966d24676626de75d44d569c" exitCode=2 Apr 24 21:53:07.515637 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.515568 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" event={"ID":"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0","Type":"ContainerDied","Data":"25a3ec0cc26baeee2e283a69d379b61babe61f7b966d24676626de75d44d569c"} Apr 24 21:53:07.585937 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.585857 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df063b6c-c298-404c-b8ca-4b4cc846593f-kserve-provision-location\") pod 
\"isvc-pmml-predictor-8bb578669-4xvw7\" (UID: \"df063b6c-c298-404c-b8ca-4b4cc846593f\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" Apr 24 21:53:07.585937 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.585900 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df063b6c-c298-404c-b8ca-4b4cc846593f-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-4xvw7\" (UID: \"df063b6c-c298-404c-b8ca-4b4cc846593f\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" Apr 24 21:53:07.585937 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.585934 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df063b6c-c298-404c-b8ca-4b4cc846593f-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-4xvw7\" (UID: \"df063b6c-c298-404c-b8ca-4b4cc846593f\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" Apr 24 21:53:07.586236 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.585959 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tl2km\" (UniqueName: \"kubernetes.io/projected/df063b6c-c298-404c-b8ca-4b4cc846593f-kube-api-access-tl2km\") pod \"isvc-pmml-predictor-8bb578669-4xvw7\" (UID: \"df063b6c-c298-404c-b8ca-4b4cc846593f\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" Apr 24 21:53:07.586348 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.586328 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df063b6c-c298-404c-b8ca-4b4cc846593f-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-4xvw7\" (UID: \"df063b6c-c298-404c-b8ca-4b4cc846593f\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" Apr 24 21:53:07.586562 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.586544 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df063b6c-c298-404c-b8ca-4b4cc846593f-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-4xvw7\" (UID: \"df063b6c-c298-404c-b8ca-4b4cc846593f\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" Apr 24 21:53:07.588632 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.588606 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df063b6c-c298-404c-b8ca-4b4cc846593f-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-4xvw7\" (UID: \"df063b6c-c298-404c-b8ca-4b4cc846593f\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" Apr 24 21:53:07.594221 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.594199 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl2km\" (UniqueName: \"kubernetes.io/projected/df063b6c-c298-404c-b8ca-4b4cc846593f-kube-api-access-tl2km\") pod \"isvc-pmml-predictor-8bb578669-4xvw7\" (UID: \"df063b6c-c298-404c-b8ca-4b4cc846593f\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" Apr 24 21:53:07.671231 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.671207 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" Apr 24 21:53:07.789949 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:07.789920 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7"] Apr 24 21:53:07.792906 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:53:07.792879 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf063b6c_c298_404c_b8ca_4b4cc846593f.slice/crio-a45d8e3f3f26862843135bdd78a8988dbe6a4245c4c51e72f7d706535fb151d2 WatchSource:0}: Error finding container a45d8e3f3f26862843135bdd78a8988dbe6a4245c4c51e72f7d706535fb151d2: Status 404 returned error can't find the container with id a45d8e3f3f26862843135bdd78a8988dbe6a4245c4c51e72f7d706535fb151d2 Apr 24 21:53:08.360950 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:08.360903 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" podUID="dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.33:8643/healthz\": dial tcp 10.133.0.33:8643: connect: connection refused" Apr 24 21:53:08.366192 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:08.366167 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" podUID="dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 24 21:53:08.519899 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:08.519858 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" event={"ID":"df063b6c-c298-404c-b8ca-4b4cc846593f","Type":"ContainerStarted","Data":"55cf2d51ec7e35ac0ba2e78011cd9f33a4eb48a6405a769f213b7010cdf0c9c0"} Apr 24 21:53:08.519899 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:08.519904 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" event={"ID":"df063b6c-c298-404c-b8ca-4b4cc846593f","Type":"ContainerStarted","Data":"a45d8e3f3f26862843135bdd78a8988dbe6a4245c4c51e72f7d706535fb151d2"} Apr 24 21:53:09.807995 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:09.807975 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" Apr 24 21:53:09.904398 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:09.904338 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbf6h\" (UniqueName: \"kubernetes.io/projected/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-kube-api-access-qbf6h\") pod \"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0\" (UID: \"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0\") " Apr 24 21:53:09.904398 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:09.904378 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0\" (UID: \"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0\") " Apr 24 21:53:09.904555 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:09.904418 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-kserve-provision-location\") pod \"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0\" (UID: \"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0\") " Apr 24 21:53:09.904555 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:09.904448 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-proxy-tls\") pod \"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0\" (UID: \"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0\") " Apr 24 21:53:09.904771 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:09.904748 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config") pod "dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" (UID: "dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0"). InnerVolumeSpecName "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:53:09.906553 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:09.906526 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-kube-api-access-qbf6h" (OuterVolumeSpecName: "kube-api-access-qbf6h") pod "dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" (UID: "dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0"). InnerVolumeSpecName "kube-api-access-qbf6h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:53:09.906667 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:09.906650 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" (UID: "dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:53:09.919302 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:09.919275 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" (UID: "dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:53:10.005086 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:10.005065 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qbf6h\" (UniqueName: \"kubernetes.io/projected/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-kube-api-access-qbf6h\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:53:10.005086 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:10.005086 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:53:10.005199 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:10.005096 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:53:10.005199 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:10.005105 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:53:10.529450 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:10.529418 2570 generic.go:358] "Generic (PLEG): container finished" podID="dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" containerID="9800dbe2430e3ec0aaa6df0144af5e51ef09da343c595b05d2f85f756bd108ef" exitCode=0 Apr 24 21:53:10.529632 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:10.529476 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" event={"ID":"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0","Type":"ContainerDied","Data":"9800dbe2430e3ec0aaa6df0144af5e51ef09da343c595b05d2f85f756bd108ef"} Apr 24 21:53:10.529632 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:10.529514 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" event={"ID":"dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0","Type":"ContainerDied","Data":"579ff4bf902aaf2528c9f2d4fa18cc240e6fdf39ee2ae3844b86bf60aa8e1acb"} Apr 24 21:53:10.529632 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:10.529490 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq" Apr 24 21:53:10.529632 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:10.529531 2570 scope.go:117] "RemoveContainer" containerID="25a3ec0cc26baeee2e283a69d379b61babe61f7b966d24676626de75d44d569c" Apr 24 21:53:10.537091 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:10.537076 2570 scope.go:117] "RemoveContainer" containerID="9800dbe2430e3ec0aaa6df0144af5e51ef09da343c595b05d2f85f756bd108ef" Apr 24 21:53:10.543891 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:10.543875 2570 scope.go:117] "RemoveContainer" containerID="82ec32b2a44f49dd6655c201ac8a5ded0ebd60d0add8a4dfc92e4ae23b9bd3eb" Apr 24 21:53:10.549322 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:10.549302 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq"] Apr 24 21:53:10.550789 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:10.550774 2570 scope.go:117] "RemoveContainer" containerID="25a3ec0cc26baeee2e283a69d379b61babe61f7b966d24676626de75d44d569c" Apr 24 21:53:10.551040 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:53:10.550985 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25a3ec0cc26baeee2e283a69d379b61babe61f7b966d24676626de75d44d569c\": container with ID starting with 25a3ec0cc26baeee2e283a69d379b61babe61f7b966d24676626de75d44d569c not found: ID does not exist" containerID="25a3ec0cc26baeee2e283a69d379b61babe61f7b966d24676626de75d44d569c" Apr 24 21:53:10.551166 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:10.551041 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25a3ec0cc26baeee2e283a69d379b61babe61f7b966d24676626de75d44d569c"} err="failed to get container status \"25a3ec0cc26baeee2e283a69d379b61babe61f7b966d24676626de75d44d569c\": rpc error: code = NotFound desc = could not find container \"25a3ec0cc26baeee2e283a69d379b61babe61f7b966d24676626de75d44d569c\": container with ID starting with 25a3ec0cc26baeee2e283a69d379b61babe61f7b966d24676626de75d44d569c not found: ID does not exist" Apr 24 21:53:10.551166 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:10.551058 2570 scope.go:117] "RemoveContainer" containerID="9800dbe2430e3ec0aaa6df0144af5e51ef09da343c595b05d2f85f756bd108ef" Apr 24 21:53:10.551362 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:53:10.551342 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9800dbe2430e3ec0aaa6df0144af5e51ef09da343c595b05d2f85f756bd108ef\": container with ID starting with 9800dbe2430e3ec0aaa6df0144af5e51ef09da343c595b05d2f85f756bd108ef not found: ID does not exist" containerID="9800dbe2430e3ec0aaa6df0144af5e51ef09da343c595b05d2f85f756bd108ef" Apr 24 21:53:10.551442 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:10.551371 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9800dbe2430e3ec0aaa6df0144af5e51ef09da343c595b05d2f85f756bd108ef"} err="failed to get container status \"9800dbe2430e3ec0aaa6df0144af5e51ef09da343c595b05d2f85f756bd108ef\": rpc error: code = NotFound desc = could not find container \"9800dbe2430e3ec0aaa6df0144af5e51ef09da343c595b05d2f85f756bd108ef\": container with ID starting with 9800dbe2430e3ec0aaa6df0144af5e51ef09da343c595b05d2f85f756bd108ef not found: ID does not exist" Apr 24 21:53:10.551442 ip-10-0-129-230 kubenswrapper[2570]: 
I0424 21:53:10.551392 2570 scope.go:117] "RemoveContainer" containerID="82ec32b2a44f49dd6655c201ac8a5ded0ebd60d0add8a4dfc92e4ae23b9bd3eb" Apr 24 21:53:10.551725 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:53:10.551698 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82ec32b2a44f49dd6655c201ac8a5ded0ebd60d0add8a4dfc92e4ae23b9bd3eb\": container with ID starting with 82ec32b2a44f49dd6655c201ac8a5ded0ebd60d0add8a4dfc92e4ae23b9bd3eb not found: ID does not exist" containerID="82ec32b2a44f49dd6655c201ac8a5ded0ebd60d0add8a4dfc92e4ae23b9bd3eb" Apr 24 21:53:10.551808 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:10.551733 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82ec32b2a44f49dd6655c201ac8a5ded0ebd60d0add8a4dfc92e4ae23b9bd3eb"} err="failed to get container status \"82ec32b2a44f49dd6655c201ac8a5ded0ebd60d0add8a4dfc92e4ae23b9bd3eb\": rpc error: code = NotFound desc = could not find container \"82ec32b2a44f49dd6655c201ac8a5ded0ebd60d0add8a4dfc92e4ae23b9bd3eb\": container with ID starting with 82ec32b2a44f49dd6655c201ac8a5ded0ebd60d0add8a4dfc92e4ae23b9bd3eb not found: ID does not exist" Apr 24 21:53:10.553458 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:10.553440 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-fzdbq"] Apr 24 21:53:11.534274 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:11.534239 2570 generic.go:358] "Generic (PLEG): container finished" podID="df063b6c-c298-404c-b8ca-4b4cc846593f" containerID="55cf2d51ec7e35ac0ba2e78011cd9f33a4eb48a6405a769f213b7010cdf0c9c0" exitCode=0 Apr 24 21:53:11.534657 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:11.534311 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" event={"ID":"df063b6c-c298-404c-b8ca-4b4cc846593f","Type":"ContainerDied","Data":"55cf2d51ec7e35ac0ba2e78011cd9f33a4eb48a6405a769f213b7010cdf0c9c0"} Apr 24 21:53:12.079311 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:12.079271 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" path="/var/lib/kubelet/pods/dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0/volumes" Apr 24 21:53:18.556590 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:18.556516 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" event={"ID":"df063b6c-c298-404c-b8ca-4b4cc846593f","Type":"ContainerStarted","Data":"a889c2908357c3d7bca358edfafcf17a3279faed413c1df888d26012d6f46f65"} Apr 24 21:53:18.556590 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:18.556554 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" event={"ID":"df063b6c-c298-404c-b8ca-4b4cc846593f","Type":"ContainerStarted","Data":"ef433507eef2e93c05bb26f4d20dbf721b0d18c413d246a22929f6d5b2702be0"} Apr 24 21:53:18.556949 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:18.556750 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" Apr 24 21:53:18.577038 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:18.576974 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" podStartSLOduration=4.888855814 podStartE2EDuration="11.57696025s" 
podCreationTimestamp="2026-04-24 21:53:07 +0000 UTC" firstStartedPulling="2026-04-24 21:53:11.535441961 +0000 UTC m=+1543.958949346" lastFinishedPulling="2026-04-24 21:53:18.223546398 +0000 UTC m=+1550.647053782" observedRunningTime="2026-04-24 21:53:18.575679226 +0000 UTC m=+1550.999186634" watchObservedRunningTime="2026-04-24 21:53:18.57696025 +0000 UTC m=+1551.000467656" Apr 24 21:53:19.559760 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:19.559720 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" Apr 24 21:53:19.561057 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:19.561015 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" podUID="df063b6c-c298-404c-b8ca-4b4cc846593f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 24 21:53:20.562732 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:20.562693 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" podUID="df063b6c-c298-404c-b8ca-4b4cc846593f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 24 21:53:25.567219 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:25.567190 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" Apr 24 21:53:25.569578 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:25.567633 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" podUID="df063b6c-c298-404c-b8ca-4b4cc846593f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 24 21:53:35.567647 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:35.567605 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" podUID="df063b6c-c298-404c-b8ca-4b4cc846593f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 24 21:53:45.567634 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:45.567596 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" podUID="df063b6c-c298-404c-b8ca-4b4cc846593f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 24 21:53:55.568122 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:53:55.568080 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" podUID="df063b6c-c298-404c-b8ca-4b4cc846593f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 24 21:54:05.568380 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:05.568339 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" podUID="df063b6c-c298-404c-b8ca-4b4cc846593f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 24 21:54:15.568279 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:15.568196 2570 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" podUID="df063b6c-c298-404c-b8ca-4b4cc846593f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 24 21:54:25.568330 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:25.568289 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" podUID="df063b6c-c298-404c-b8ca-4b4cc846593f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 24 21:54:34.067002 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:34.066975 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" Apr 24 21:54:38.385082 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.385016 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7"] Apr 24 21:54:38.385542 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.385430 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" podUID="df063b6c-c298-404c-b8ca-4b4cc846593f" containerName="kserve-container" containerID="cri-o://ef433507eef2e93c05bb26f4d20dbf721b0d18c413d246a22929f6d5b2702be0" gracePeriod=30 Apr 24 21:54:38.385617 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.385491 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" podUID="df063b6c-c298-404c-b8ca-4b4cc846593f" containerName="kube-rbac-proxy" containerID="cri-o://a889c2908357c3d7bca358edfafcf17a3279faed413c1df888d26012d6f46f65" gracePeriod=30 Apr 24 21:54:38.510847 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.510818 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w"] Apr 24 21:54:38.511144 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.511131 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" containerName="kserve-container" Apr 24 21:54:38.511189 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.511146 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" containerName="kserve-container" Apr 24 21:54:38.511189 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.511157 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" containerName="storage-initializer" Apr 24 21:54:38.511189 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.511164 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" containerName="storage-initializer" Apr 24 21:54:38.511189 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.511178 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" containerName="kube-rbac-proxy" Apr 24 21:54:38.511189 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.511184 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" containerName="kube-rbac-proxy" Apr 24 21:54:38.511341 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.511240 2570 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" containerName="kube-rbac-proxy" Apr 24 21:54:38.511341 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.511250 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="dfa5b575-2889-4bc1-88cf-4dfcc4e84ad0" containerName="kserve-container" Apr 24 21:54:38.514059 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.514043 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" Apr 24 21:54:38.518904 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.518884 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-kube-rbac-proxy-sar-config\"" Apr 24 21:54:38.519011 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.518887 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-predictor-serving-cert\"" Apr 24 21:54:38.538879 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.538859 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w"] Apr 24 21:54:38.610149 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.610124 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3971e441-4130-49c0-9fec-01afff2b0ce1-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-6vx4w\" (UID: \"3971e441-4130-49c0-9fec-01afff2b0ce1\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" Apr 24 21:54:38.610252 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.610155 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3971e441-4130-49c0-9fec-01afff2b0ce1-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-6vx4w\" (UID: \"3971e441-4130-49c0-9fec-01afff2b0ce1\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" Apr 24 21:54:38.610252 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.610172 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3971e441-4130-49c0-9fec-01afff2b0ce1-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-6vx4w\" (UID: \"3971e441-4130-49c0-9fec-01afff2b0ce1\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" Apr 24 21:54:38.610252 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.610190 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7kk4\" (UniqueName: \"kubernetes.io/projected/3971e441-4130-49c0-9fec-01afff2b0ce1-kube-api-access-q7kk4\") pod \"isvc-pmml-runtime-predictor-67bc544947-6vx4w\" (UID: \"3971e441-4130-49c0-9fec-01afff2b0ce1\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" Apr 24 21:54:38.710522 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.710500 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3971e441-4130-49c0-9fec-01afff2b0ce1-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-6vx4w\" (UID: 
\"3971e441-4130-49c0-9fec-01afff2b0ce1\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" Apr 24 21:54:38.710642 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.710532 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3971e441-4130-49c0-9fec-01afff2b0ce1-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-6vx4w\" (UID: \"3971e441-4130-49c0-9fec-01afff2b0ce1\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" Apr 24 21:54:38.710642 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.710550 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3971e441-4130-49c0-9fec-01afff2b0ce1-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-6vx4w\" (UID: \"3971e441-4130-49c0-9fec-01afff2b0ce1\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" Apr 24 21:54:38.710747 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.710655 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7kk4\" (UniqueName: \"kubernetes.io/projected/3971e441-4130-49c0-9fec-01afff2b0ce1-kube-api-access-q7kk4\") pod \"isvc-pmml-runtime-predictor-67bc544947-6vx4w\" (UID: \"3971e441-4130-49c0-9fec-01afff2b0ce1\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" Apr 24 21:54:38.710915 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.710895 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3971e441-4130-49c0-9fec-01afff2b0ce1-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-6vx4w\" (UID: \"3971e441-4130-49c0-9fec-01afff2b0ce1\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" Apr 24 21:54:38.711205 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.711187 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3971e441-4130-49c0-9fec-01afff2b0ce1-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-6vx4w\" (UID: \"3971e441-4130-49c0-9fec-01afff2b0ce1\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" Apr 24 21:54:38.712981 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.712963 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3971e441-4130-49c0-9fec-01afff2b0ce1-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-6vx4w\" (UID: \"3971e441-4130-49c0-9fec-01afff2b0ce1\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" Apr 24 21:54:38.721111 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.721089 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7kk4\" (UniqueName: \"kubernetes.io/projected/3971e441-4130-49c0-9fec-01afff2b0ce1-kube-api-access-q7kk4\") pod \"isvc-pmml-runtime-predictor-67bc544947-6vx4w\" (UID: \"3971e441-4130-49c0-9fec-01afff2b0ce1\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" Apr 24 21:54:38.772906 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.772882 2570 generic.go:358] "Generic (PLEG): container finished" podID="df063b6c-c298-404c-b8ca-4b4cc846593f" 
containerID="a889c2908357c3d7bca358edfafcf17a3279faed413c1df888d26012d6f46f65" exitCode=2 Apr 24 21:54:38.773006 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.772951 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" event={"ID":"df063b6c-c298-404c-b8ca-4b4cc846593f","Type":"ContainerDied","Data":"a889c2908357c3d7bca358edfafcf17a3279faed413c1df888d26012d6f46f65"} Apr 24 21:54:38.823417 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.823395 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" Apr 24 21:54:38.940271 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:38.940237 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w"] Apr 24 21:54:38.943971 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:54:38.943944 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3971e441_4130_49c0_9fec_01afff2b0ce1.slice/crio-f7fd657c6570129e751f03bdae2d44becce6752d14e35bbd0f46101e9dc92aae WatchSource:0}: Error finding container f7fd657c6570129e751f03bdae2d44becce6752d14e35bbd0f46101e9dc92aae: Status 404 returned error can't find the container with id f7fd657c6570129e751f03bdae2d44becce6752d14e35bbd0f46101e9dc92aae Apr 24 21:54:39.777374 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:39.777324 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" event={"ID":"3971e441-4130-49c0-9fec-01afff2b0ce1","Type":"ContainerStarted","Data":"c42187a141df860e52e1037dc67bf1c507a6f504eafbe4d815ed78f1a460d920"} Apr 24 21:54:39.777374 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:39.777369 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" event={"ID":"3971e441-4130-49c0-9fec-01afff2b0ce1","Type":"ContainerStarted","Data":"f7fd657c6570129e751f03bdae2d44becce6752d14e35bbd0f46101e9dc92aae"} Apr 24 21:54:40.563296 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:40.563260 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" podUID="df063b6c-c298-404c-b8ca-4b4cc846593f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.34:8643/healthz\": dial tcp 10.133.0.34:8643: connect: connection refused" Apr 24 21:54:41.515006 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.514986 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" Apr 24 21:54:41.631623 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.631560 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl2km\" (UniqueName: \"kubernetes.io/projected/df063b6c-c298-404c-b8ca-4b4cc846593f-kube-api-access-tl2km\") pod \"df063b6c-c298-404c-b8ca-4b4cc846593f\" (UID: \"df063b6c-c298-404c-b8ca-4b4cc846593f\") " Apr 24 21:54:41.631623 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.631597 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df063b6c-c298-404c-b8ca-4b4cc846593f-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"df063b6c-c298-404c-b8ca-4b4cc846593f\" (UID: \"df063b6c-c298-404c-b8ca-4b4cc846593f\") " Apr 24 21:54:41.631832 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.631641 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df063b6c-c298-404c-b8ca-4b4cc846593f-kserve-provision-location\") pod \"df063b6c-c298-404c-b8ca-4b4cc846593f\" (UID: \"df063b6c-c298-404c-b8ca-4b4cc846593f\") " Apr 24 21:54:41.631832 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.631665 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df063b6c-c298-404c-b8ca-4b4cc846593f-proxy-tls\") pod \"df063b6c-c298-404c-b8ca-4b4cc846593f\" (UID: \"df063b6c-c298-404c-b8ca-4b4cc846593f\") " Apr 24 21:54:41.631963 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.631938 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df063b6c-c298-404c-b8ca-4b4cc846593f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "df063b6c-c298-404c-b8ca-4b4cc846593f" (UID: "df063b6c-c298-404c-b8ca-4b4cc846593f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:54:41.632064 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.632016 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df063b6c-c298-404c-b8ca-4b4cc846593f-isvc-pmml-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-kube-rbac-proxy-sar-config") pod "df063b6c-c298-404c-b8ca-4b4cc846593f" (UID: "df063b6c-c298-404c-b8ca-4b4cc846593f"). InnerVolumeSpecName "isvc-pmml-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:54:41.633665 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.633644 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df063b6c-c298-404c-b8ca-4b4cc846593f-kube-api-access-tl2km" (OuterVolumeSpecName: "kube-api-access-tl2km") pod "df063b6c-c298-404c-b8ca-4b4cc846593f" (UID: "df063b6c-c298-404c-b8ca-4b4cc846593f"). InnerVolumeSpecName "kube-api-access-tl2km". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:54:41.633755 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.633731 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df063b6c-c298-404c-b8ca-4b4cc846593f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "df063b6c-c298-404c-b8ca-4b4cc846593f" (UID: "df063b6c-c298-404c-b8ca-4b4cc846593f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:54:41.732786 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.732763 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tl2km\" (UniqueName: \"kubernetes.io/projected/df063b6c-c298-404c-b8ca-4b4cc846593f-kube-api-access-tl2km\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:54:41.732786 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.732787 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df063b6c-c298-404c-b8ca-4b4cc846593f-isvc-pmml-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:54:41.732966 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.732797 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df063b6c-c298-404c-b8ca-4b4cc846593f-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:54:41.732966 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.732808 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df063b6c-c298-404c-b8ca-4b4cc846593f-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:54:41.784161 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.784136 2570 generic.go:358] "Generic (PLEG): container finished" podID="df063b6c-c298-404c-b8ca-4b4cc846593f" containerID="ef433507eef2e93c05bb26f4d20dbf721b0d18c413d246a22929f6d5b2702be0" exitCode=0 Apr 24 21:54:41.784276 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.784172 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" event={"ID":"df063b6c-c298-404c-b8ca-4b4cc846593f","Type":"ContainerDied","Data":"ef433507eef2e93c05bb26f4d20dbf721b0d18c413d246a22929f6d5b2702be0"} Apr 24 21:54:41.784276 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.784194 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" event={"ID":"df063b6c-c298-404c-b8ca-4b4cc846593f","Type":"ContainerDied","Data":"a45d8e3f3f26862843135bdd78a8988dbe6a4245c4c51e72f7d706535fb151d2"} Apr 24 21:54:41.784276 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.784209 2570 scope.go:117] "RemoveContainer" containerID="a889c2908357c3d7bca358edfafcf17a3279faed413c1df888d26012d6f46f65" Apr 24 21:54:41.784276 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.784215 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7" Apr 24 21:54:41.791717 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.791686 2570 scope.go:117] "RemoveContainer" containerID="ef433507eef2e93c05bb26f4d20dbf721b0d18c413d246a22929f6d5b2702be0" Apr 24 21:54:41.798725 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.798711 2570 scope.go:117] "RemoveContainer" containerID="55cf2d51ec7e35ac0ba2e78011cd9f33a4eb48a6405a769f213b7010cdf0c9c0" Apr 24 21:54:41.805218 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.805202 2570 scope.go:117] "RemoveContainer" containerID="a889c2908357c3d7bca358edfafcf17a3279faed413c1df888d26012d6f46f65" Apr 24 21:54:41.805460 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:54:41.805435 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a889c2908357c3d7bca358edfafcf17a3279faed413c1df888d26012d6f46f65\": container with ID starting with a889c2908357c3d7bca358edfafcf17a3279faed413c1df888d26012d6f46f65 not found: ID does not exist" containerID="a889c2908357c3d7bca358edfafcf17a3279faed413c1df888d26012d6f46f65" Apr 24 21:54:41.805558 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.805472 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a889c2908357c3d7bca358edfafcf17a3279faed413c1df888d26012d6f46f65"} err="failed to get container status \"a889c2908357c3d7bca358edfafcf17a3279faed413c1df888d26012d6f46f65\": rpc error: code = NotFound desc = could not find container \"a889c2908357c3d7bca358edfafcf17a3279faed413c1df888d26012d6f46f65\": container with ID starting with a889c2908357c3d7bca358edfafcf17a3279faed413c1df888d26012d6f46f65 not found: ID does not exist" Apr 24 21:54:41.805558 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.805494 2570 scope.go:117] "RemoveContainer" containerID="ef433507eef2e93c05bb26f4d20dbf721b0d18c413d246a22929f6d5b2702be0" Apr 24 21:54:41.805773 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:54:41.805725 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef433507eef2e93c05bb26f4d20dbf721b0d18c413d246a22929f6d5b2702be0\": container with ID starting with ef433507eef2e93c05bb26f4d20dbf721b0d18c413d246a22929f6d5b2702be0 not found: ID does not exist" containerID="ef433507eef2e93c05bb26f4d20dbf721b0d18c413d246a22929f6d5b2702be0" Apr 24 21:54:41.805826 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.805779 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef433507eef2e93c05bb26f4d20dbf721b0d18c413d246a22929f6d5b2702be0"} err="failed to get container status \"ef433507eef2e93c05bb26f4d20dbf721b0d18c413d246a22929f6d5b2702be0\": rpc error: code = NotFound desc = could not find container \"ef433507eef2e93c05bb26f4d20dbf721b0d18c413d246a22929f6d5b2702be0\": container with ID starting with ef433507eef2e93c05bb26f4d20dbf721b0d18c413d246a22929f6d5b2702be0 not found: ID does not exist" Apr 24 21:54:41.805826 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.805798 2570 scope.go:117] "RemoveContainer" containerID="55cf2d51ec7e35ac0ba2e78011cd9f33a4eb48a6405a769f213b7010cdf0c9c0" Apr 24 21:54:41.806068 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:54:41.806049 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55cf2d51ec7e35ac0ba2e78011cd9f33a4eb48a6405a769f213b7010cdf0c9c0\": 
container with ID starting with 55cf2d51ec7e35ac0ba2e78011cd9f33a4eb48a6405a769f213b7010cdf0c9c0 not found: ID does not exist" containerID="55cf2d51ec7e35ac0ba2e78011cd9f33a4eb48a6405a769f213b7010cdf0c9c0" Apr 24 21:54:41.806143 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.806071 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55cf2d51ec7e35ac0ba2e78011cd9f33a4eb48a6405a769f213b7010cdf0c9c0"} err="failed to get container status \"55cf2d51ec7e35ac0ba2e78011cd9f33a4eb48a6405a769f213b7010cdf0c9c0\": rpc error: code = NotFound desc = could not find container \"55cf2d51ec7e35ac0ba2e78011cd9f33a4eb48a6405a769f213b7010cdf0c9c0\": container with ID starting with 55cf2d51ec7e35ac0ba2e78011cd9f33a4eb48a6405a769f213b7010cdf0c9c0 not found: ID does not exist" Apr 24 21:54:41.806259 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.806243 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7"] Apr 24 21:54:41.812390 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:41.812373 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-4xvw7"] Apr 24 21:54:42.067051 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:42.066913 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df063b6c-c298-404c-b8ca-4b4cc846593f" path="/var/lib/kubelet/pods/df063b6c-c298-404c-b8ca-4b4cc846593f/volumes" Apr 24 21:54:42.788638 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:42.788606 2570 generic.go:358] "Generic (PLEG): container finished" podID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerID="c42187a141df860e52e1037dc67bf1c507a6f504eafbe4d815ed78f1a460d920" exitCode=0 Apr 24 21:54:42.788934 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:42.788659 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" event={"ID":"3971e441-4130-49c0-9fec-01afff2b0ce1","Type":"ContainerDied","Data":"c42187a141df860e52e1037dc67bf1c507a6f504eafbe4d815ed78f1a460d920"} Apr 24 21:54:43.793393 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:43.793358 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" event={"ID":"3971e441-4130-49c0-9fec-01afff2b0ce1","Type":"ContainerStarted","Data":"13c5db918eec23131a17da161cc92d9c0525979145dd7f5db74b9095761b264a"} Apr 24 21:54:43.793393 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:43.793396 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" event={"ID":"3971e441-4130-49c0-9fec-01afff2b0ce1","Type":"ContainerStarted","Data":"b15d755d2501332f06ddf39446c71aa3827916b8cc1a9f421970e548f633e126"} Apr 24 21:54:43.793869 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:43.793727 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" Apr 24 21:54:43.793869 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:43.793825 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" Apr 24 21:54:43.795094 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:43.795062 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" podUID="3971e441-4130-49c0-9fec-01afff2b0ce1" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 24 21:54:43.816360 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:43.816322 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" podStartSLOduration=5.8163114270000005 podStartE2EDuration="5.816311427s" podCreationTimestamp="2026-04-24 21:54:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:54:43.814427178 +0000 UTC m=+1636.237934594" watchObservedRunningTime="2026-04-24 21:54:43.816311427 +0000 UTC m=+1636.239818832" Apr 24 21:54:44.797292 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:44.797254 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" podUID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 24 21:54:45.799693 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:45.799656 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" podUID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 24 21:54:50.803773 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:50.803745 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" Apr 24 21:54:50.804351 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:54:50.804324 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" podUID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 24 21:55:00.804382 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:55:00.804343 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" podUID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 24 21:55:10.804773 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:55:10.804736 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" podUID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 24 21:55:20.804772 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:55:20.804734 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" podUID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 24 21:55:30.805174 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:55:30.805135 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" podUID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: 
connect: connection refused" Apr 24 21:55:40.805139 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:55:40.805056 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" podUID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 24 21:55:50.804382 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:55:50.804338 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" podUID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 24 21:56:00.805187 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:00.805148 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" Apr 24 21:56:09.488263 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.488230 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w"] Apr 24 21:56:09.489122 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.488559 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" podUID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerName="kserve-container" containerID="cri-o://b15d755d2501332f06ddf39446c71aa3827916b8cc1a9f421970e548f633e126" gracePeriod=30 Apr 24 21:56:09.489122 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.488637 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" podUID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerName="kube-rbac-proxy" containerID="cri-o://13c5db918eec23131a17da161cc92d9c0525979145dd7f5db74b9095761b264a" gracePeriod=30 Apr 24 21:56:09.614194 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.614168 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs"] Apr 24 21:56:09.614473 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.614460 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df063b6c-c298-404c-b8ca-4b4cc846593f" containerName="storage-initializer" Apr 24 21:56:09.614517 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.614475 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="df063b6c-c298-404c-b8ca-4b4cc846593f" containerName="storage-initializer" Apr 24 21:56:09.614517 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.614506 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df063b6c-c298-404c-b8ca-4b4cc846593f" containerName="kube-rbac-proxy" Apr 24 21:56:09.614517 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.614515 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="df063b6c-c298-404c-b8ca-4b4cc846593f" containerName="kube-rbac-proxy" Apr 24 21:56:09.614614 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.614527 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df063b6c-c298-404c-b8ca-4b4cc846593f" containerName="kserve-container" Apr 24 21:56:09.614614 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.614532 2570 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="df063b6c-c298-404c-b8ca-4b4cc846593f" containerName="kserve-container" Apr 24 21:56:09.614614 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.614583 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="df063b6c-c298-404c-b8ca-4b4cc846593f" containerName="kube-rbac-proxy" Apr 24 21:56:09.614614 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.614594 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="df063b6c-c298-404c-b8ca-4b4cc846593f" containerName="kserve-container" Apr 24 21:56:09.617392 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.617373 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" Apr 24 21:56:09.619492 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.619472 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-predictor-serving-cert\"" Apr 24 21:56:09.619585 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.619519 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 24 21:56:09.629600 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.629575 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs"] Apr 24 21:56:09.733150 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.733115 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16596a46-9d58-4fa3-83c9-46f49d7b50a7-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs\" (UID: \"16596a46-9d58-4fa3-83c9-46f49d7b50a7\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" Apr 24 21:56:09.733315 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.733161 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/16596a46-9d58-4fa3-83c9-46f49d7b50a7-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs\" (UID: \"16596a46-9d58-4fa3-83c9-46f49d7b50a7\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" Apr 24 21:56:09.733315 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.733200 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wl46\" (UniqueName: \"kubernetes.io/projected/16596a46-9d58-4fa3-83c9-46f49d7b50a7-kube-api-access-7wl46\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs\" (UID: \"16596a46-9d58-4fa3-83c9-46f49d7b50a7\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" Apr 24 21:56:09.733315 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.733222 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/16596a46-9d58-4fa3-83c9-46f49d7b50a7-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs\" (UID: \"16596a46-9d58-4fa3-83c9-46f49d7b50a7\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" Apr 24 21:56:09.834465 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.834388 2570 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16596a46-9d58-4fa3-83c9-46f49d7b50a7-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs\" (UID: \"16596a46-9d58-4fa3-83c9-46f49d7b50a7\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" Apr 24 21:56:09.834465 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.834438 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/16596a46-9d58-4fa3-83c9-46f49d7b50a7-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs\" (UID: \"16596a46-9d58-4fa3-83c9-46f49d7b50a7\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" Apr 24 21:56:09.834465 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.834467 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wl46\" (UniqueName: \"kubernetes.io/projected/16596a46-9d58-4fa3-83c9-46f49d7b50a7-kube-api-access-7wl46\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs\" (UID: \"16596a46-9d58-4fa3-83c9-46f49d7b50a7\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" Apr 24 21:56:09.834642 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.834488 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/16596a46-9d58-4fa3-83c9-46f49d7b50a7-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs\" (UID: \"16596a46-9d58-4fa3-83c9-46f49d7b50a7\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" Apr 24 21:56:09.834959 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.834939 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/16596a46-9d58-4fa3-83c9-46f49d7b50a7-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs\" (UID: \"16596a46-9d58-4fa3-83c9-46f49d7b50a7\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" Apr 24 21:56:09.835238 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.835219 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/16596a46-9d58-4fa3-83c9-46f49d7b50a7-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs\" (UID: \"16596a46-9d58-4fa3-83c9-46f49d7b50a7\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" Apr 24 21:56:09.837053 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.837012 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16596a46-9d58-4fa3-83c9-46f49d7b50a7-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs\" (UID: \"16596a46-9d58-4fa3-83c9-46f49d7b50a7\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" Apr 24 21:56:09.842873 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.842853 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wl46\" (UniqueName: \"kubernetes.io/projected/16596a46-9d58-4fa3-83c9-46f49d7b50a7-kube-api-access-7wl46\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs\" (UID: \"16596a46-9d58-4fa3-83c9-46f49d7b50a7\") " 
pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" Apr 24 21:56:09.929966 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:09.929942 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" Apr 24 21:56:10.040722 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:10.040691 2570 generic.go:358] "Generic (PLEG): container finished" podID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerID="13c5db918eec23131a17da161cc92d9c0525979145dd7f5db74b9095761b264a" exitCode=2 Apr 24 21:56:10.040864 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:10.040760 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" event={"ID":"3971e441-4130-49c0-9fec-01afff2b0ce1","Type":"ContainerDied","Data":"13c5db918eec23131a17da161cc92d9c0525979145dd7f5db74b9095761b264a"} Apr 24 21:56:10.050469 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:10.050442 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs"] Apr 24 21:56:10.053318 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:56:10.053297 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16596a46_9d58_4fa3_83c9_46f49d7b50a7.slice/crio-a98e2d4afc85d0912ddcbf78c3b538370c2455958d76214ffe9f6502643e558b WatchSource:0}: Error finding container a98e2d4afc85d0912ddcbf78c3b538370c2455958d76214ffe9f6502643e558b: Status 404 returned error can't find the container with id a98e2d4afc85d0912ddcbf78c3b538370c2455958d76214ffe9f6502643e558b Apr 24 21:56:10.800729 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:10.800688 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" podUID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.35:8643/healthz\": dial tcp 10.133.0.35:8643: connect: connection refused" Apr 24 21:56:10.804972 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:10.804942 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" podUID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 24 21:56:11.045002 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:11.044964 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" event={"ID":"16596a46-9d58-4fa3-83c9-46f49d7b50a7","Type":"ContainerStarted","Data":"87caaec52297e8e0a1e2a6b530d5bc9d08a16a9595f5d2c3e84b5316f9822434"} Apr 24 21:56:11.045002 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:11.045002 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" event={"ID":"16596a46-9d58-4fa3-83c9-46f49d7b50a7","Type":"ContainerStarted","Data":"a98e2d4afc85d0912ddcbf78c3b538370c2455958d76214ffe9f6502643e558b"} Apr 24 21:56:12.820509 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:12.820489 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" Apr 24 21:56:12.857261 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:12.857230 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3971e441-4130-49c0-9fec-01afff2b0ce1-kserve-provision-location\") pod \"3971e441-4130-49c0-9fec-01afff2b0ce1\" (UID: \"3971e441-4130-49c0-9fec-01afff2b0ce1\") " Apr 24 21:56:12.857399 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:12.857264 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3971e441-4130-49c0-9fec-01afff2b0ce1-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"3971e441-4130-49c0-9fec-01afff2b0ce1\" (UID: \"3971e441-4130-49c0-9fec-01afff2b0ce1\") " Apr 24 21:56:12.857399 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:12.857283 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7kk4\" (UniqueName: \"kubernetes.io/projected/3971e441-4130-49c0-9fec-01afff2b0ce1-kube-api-access-q7kk4\") pod \"3971e441-4130-49c0-9fec-01afff2b0ce1\" (UID: \"3971e441-4130-49c0-9fec-01afff2b0ce1\") " Apr 24 21:56:12.857399 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:12.857305 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3971e441-4130-49c0-9fec-01afff2b0ce1-proxy-tls\") pod \"3971e441-4130-49c0-9fec-01afff2b0ce1\" (UID: \"3971e441-4130-49c0-9fec-01afff2b0ce1\") " Apr 24 21:56:12.857612 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:12.857585 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3971e441-4130-49c0-9fec-01afff2b0ce1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3971e441-4130-49c0-9fec-01afff2b0ce1" (UID: "3971e441-4130-49c0-9fec-01afff2b0ce1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:56:12.857690 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:12.857608 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3971e441-4130-49c0-9fec-01afff2b0ce1-isvc-pmml-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-runtime-kube-rbac-proxy-sar-config") pod "3971e441-4130-49c0-9fec-01afff2b0ce1" (UID: "3971e441-4130-49c0-9fec-01afff2b0ce1"). InnerVolumeSpecName "isvc-pmml-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:56:12.859615 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:12.859587 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3971e441-4130-49c0-9fec-01afff2b0ce1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3971e441-4130-49c0-9fec-01afff2b0ce1" (UID: "3971e441-4130-49c0-9fec-01afff2b0ce1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:56:12.859615 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:12.859592 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3971e441-4130-49c0-9fec-01afff2b0ce1-kube-api-access-q7kk4" (OuterVolumeSpecName: "kube-api-access-q7kk4") pod "3971e441-4130-49c0-9fec-01afff2b0ce1" (UID: "3971e441-4130-49c0-9fec-01afff2b0ce1"). 
InnerVolumeSpecName "kube-api-access-q7kk4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:56:12.957816 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:12.957785 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3971e441-4130-49c0-9fec-01afff2b0ce1-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:56:12.957816 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:12.957811 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3971e441-4130-49c0-9fec-01afff2b0ce1-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:56:12.958052 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:12.957825 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q7kk4\" (UniqueName: \"kubernetes.io/projected/3971e441-4130-49c0-9fec-01afff2b0ce1-kube-api-access-q7kk4\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:56:12.958052 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:12.957840 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3971e441-4130-49c0-9fec-01afff2b0ce1-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:56:13.051571 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:13.051537 2570 generic.go:358] "Generic (PLEG): container finished" podID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerID="b15d755d2501332f06ddf39446c71aa3827916b8cc1a9f421970e548f633e126" exitCode=0 Apr 24 21:56:13.051716 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:13.051605 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" event={"ID":"3971e441-4130-49c0-9fec-01afff2b0ce1","Type":"ContainerDied","Data":"b15d755d2501332f06ddf39446c71aa3827916b8cc1a9f421970e548f633e126"} Apr 24 21:56:13.051716 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:13.051633 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" Apr 24 21:56:13.051716 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:13.051641 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w" event={"ID":"3971e441-4130-49c0-9fec-01afff2b0ce1","Type":"ContainerDied","Data":"f7fd657c6570129e751f03bdae2d44becce6752d14e35bbd0f46101e9dc92aae"} Apr 24 21:56:13.051716 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:13.051659 2570 scope.go:117] "RemoveContainer" containerID="13c5db918eec23131a17da161cc92d9c0525979145dd7f5db74b9095761b264a" Apr 24 21:56:13.060207 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:13.060165 2570 scope.go:117] "RemoveContainer" containerID="b15d755d2501332f06ddf39446c71aa3827916b8cc1a9f421970e548f633e126" Apr 24 21:56:13.067767 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:13.067748 2570 scope.go:117] "RemoveContainer" containerID="c42187a141df860e52e1037dc67bf1c507a6f504eafbe4d815ed78f1a460d920" Apr 24 21:56:13.075208 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:13.075190 2570 scope.go:117] "RemoveContainer" containerID="13c5db918eec23131a17da161cc92d9c0525979145dd7f5db74b9095761b264a" Apr 24 21:56:13.075455 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:56:13.075437 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13c5db918eec23131a17da161cc92d9c0525979145dd7f5db74b9095761b264a\": container with ID starting with 13c5db918eec23131a17da161cc92d9c0525979145dd7f5db74b9095761b264a not found: ID does not exist" containerID="13c5db918eec23131a17da161cc92d9c0525979145dd7f5db74b9095761b264a" Apr 24 21:56:13.075537 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:13.075461 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13c5db918eec23131a17da161cc92d9c0525979145dd7f5db74b9095761b264a"} err="failed to get container status \"13c5db918eec23131a17da161cc92d9c0525979145dd7f5db74b9095761b264a\": rpc error: code = NotFound desc = could not find container \"13c5db918eec23131a17da161cc92d9c0525979145dd7f5db74b9095761b264a\": container with ID starting with 13c5db918eec23131a17da161cc92d9c0525979145dd7f5db74b9095761b264a not found: ID does not exist" Apr 24 21:56:13.075537 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:13.075477 2570 scope.go:117] "RemoveContainer" containerID="b15d755d2501332f06ddf39446c71aa3827916b8cc1a9f421970e548f633e126" Apr 24 21:56:13.075715 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:56:13.075697 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b15d755d2501332f06ddf39446c71aa3827916b8cc1a9f421970e548f633e126\": container with ID starting with b15d755d2501332f06ddf39446c71aa3827916b8cc1a9f421970e548f633e126 not found: ID does not exist" containerID="b15d755d2501332f06ddf39446c71aa3827916b8cc1a9f421970e548f633e126" Apr 24 21:56:13.075760 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:13.075720 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b15d755d2501332f06ddf39446c71aa3827916b8cc1a9f421970e548f633e126"} err="failed to get container status \"b15d755d2501332f06ddf39446c71aa3827916b8cc1a9f421970e548f633e126\": rpc error: code = NotFound desc = could not find container \"b15d755d2501332f06ddf39446c71aa3827916b8cc1a9f421970e548f633e126\": container with ID starting with 
b15d755d2501332f06ddf39446c71aa3827916b8cc1a9f421970e548f633e126 not found: ID does not exist" Apr 24 21:56:13.075760 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:13.075749 2570 scope.go:117] "RemoveContainer" containerID="c42187a141df860e52e1037dc67bf1c507a6f504eafbe4d815ed78f1a460d920" Apr 24 21:56:13.076086 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:56:13.076062 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c42187a141df860e52e1037dc67bf1c507a6f504eafbe4d815ed78f1a460d920\": container with ID starting with c42187a141df860e52e1037dc67bf1c507a6f504eafbe4d815ed78f1a460d920 not found: ID does not exist" containerID="c42187a141df860e52e1037dc67bf1c507a6f504eafbe4d815ed78f1a460d920" Apr 24 21:56:13.076148 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:13.076094 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c42187a141df860e52e1037dc67bf1c507a6f504eafbe4d815ed78f1a460d920"} err="failed to get container status \"c42187a141df860e52e1037dc67bf1c507a6f504eafbe4d815ed78f1a460d920\": rpc error: code = NotFound desc = could not find container \"c42187a141df860e52e1037dc67bf1c507a6f504eafbe4d815ed78f1a460d920\": container with ID starting with c42187a141df860e52e1037dc67bf1c507a6f504eafbe4d815ed78f1a460d920 not found: ID does not exist" Apr 24 21:56:13.084548 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:13.084527 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w"] Apr 24 21:56:13.089594 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:13.089571 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-6vx4w"] Apr 24 21:56:14.056734 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:14.056690 2570 generic.go:358] "Generic (PLEG): container finished" podID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerID="87caaec52297e8e0a1e2a6b530d5bc9d08a16a9595f5d2c3e84b5316f9822434" exitCode=0 Apr 24 21:56:14.057052 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:14.056751 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" event={"ID":"16596a46-9d58-4fa3-83c9-46f49d7b50a7","Type":"ContainerDied","Data":"87caaec52297e8e0a1e2a6b530d5bc9d08a16a9595f5d2c3e84b5316f9822434"} Apr 24 21:56:14.067183 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:14.067156 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3971e441-4130-49c0-9fec-01afff2b0ce1" path="/var/lib/kubelet/pods/3971e441-4130-49c0-9fec-01afff2b0ce1/volumes" Apr 24 21:56:15.061340 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:15.061308 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" event={"ID":"16596a46-9d58-4fa3-83c9-46f49d7b50a7","Type":"ContainerStarted","Data":"efe3a2477f972aded9f1e228199f641ad398913c502f55cb10072a6b369eae66"} Apr 24 21:56:15.061340 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:15.061344 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" event={"ID":"16596a46-9d58-4fa3-83c9-46f49d7b50a7","Type":"ContainerStarted","Data":"e038650fa4a7dc54b2fb9ed6c36c219cac88766c899c05066daf180bdce1db17"} Apr 24 21:56:15.061801 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:15.061625 2570 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" Apr 24 21:56:15.061801 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:15.061755 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" Apr 24 21:56:15.063041 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:15.063000 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" podUID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 24 21:56:15.098751 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:15.098706 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" podStartSLOduration=6.098694263 podStartE2EDuration="6.098694263s" podCreationTimestamp="2026-04-24 21:56:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:56:15.093511497 +0000 UTC m=+1727.517018904" watchObservedRunningTime="2026-04-24 21:56:15.098694263 +0000 UTC m=+1727.522201648" Apr 24 21:56:16.064821 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:16.064778 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" podUID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 24 21:56:21.069923 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:21.069890 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" Apr 24 21:56:21.070426 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:21.070401 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" podUID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 24 21:56:31.070617 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:31.070577 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" podUID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 24 21:56:41.070634 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:41.070592 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" podUID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 24 21:56:51.071249 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:56:51.071211 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" podUID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 24 21:57:01.071309 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:01.071273 2570 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" podUID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 24 21:57:11.070461 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:11.070415 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" podUID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 24 21:57:21.071373 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:21.071327 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" podUID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 24 21:57:22.064113 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:22.064068 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" podUID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 24 21:57:28.084480 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:28.084450 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 21:57:28.088258 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:28.088239 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 21:57:32.066470 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:32.066440 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" Apr 24 21:57:40.686418 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:40.686387 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs"] Apr 24 21:57:40.686821 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:40.686702 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" podUID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerName="kserve-container" containerID="cri-o://e038650fa4a7dc54b2fb9ed6c36c219cac88766c899c05066daf180bdce1db17" gracePeriod=30 Apr 24 21:57:40.686821 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:40.686760 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" podUID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerName="kube-rbac-proxy" containerID="cri-o://efe3a2477f972aded9f1e228199f641ad398913c502f55cb10072a6b369eae66" gracePeriod=30 Apr 24 21:57:40.825483 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:40.825450 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7"] Apr 24 21:57:40.825740 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:40.825727 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerName="storage-initializer" Apr 24 21:57:40.825785 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:40.825742 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerName="storage-initializer" Apr 24 21:57:40.825785 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:40.825749 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerName="kserve-container" Apr 24 21:57:40.825785 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:40.825755 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerName="kserve-container" Apr 24 21:57:40.825785 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:40.825775 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerName="kube-rbac-proxy" Apr 24 21:57:40.825785 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:40.825780 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerName="kube-rbac-proxy" Apr 24 21:57:40.825936 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:40.825822 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerName="kube-rbac-proxy" Apr 24 21:57:40.825936 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:40.825832 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="3971e441-4130-49c0-9fec-01afff2b0ce1" containerName="kserve-container" Apr 24 21:57:40.828638 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:40.828622 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" Apr 24 21:57:40.832653 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:40.832633 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-a2e41b-predictor-serving-cert\"" Apr 24 21:57:40.832765 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:40.832661 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-a2e41b-kube-rbac-proxy-sar-config\"" Apr 24 21:57:40.842926 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:40.842907 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7"] Apr 24 21:57:40.968110 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:40.968046 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67png\" (UniqueName: \"kubernetes.io/projected/9ffb3d24-cf41-4903-838d-f9853530ca89-kube-api-access-67png\") pod \"isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7\" (UID: \"9ffb3d24-cf41-4903-838d-f9853530ca89\") " pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" Apr 24 21:57:40.968110 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:40.968107 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-a2e41b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ffb3d24-cf41-4903-838d-f9853530ca89-isvc-primary-a2e41b-kube-rbac-proxy-sar-config\") pod \"isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7\" (UID: \"9ffb3d24-cf41-4903-838d-f9853530ca89\") " 
pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" Apr 24 21:57:40.968263 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:40.968159 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ffb3d24-cf41-4903-838d-f9853530ca89-kserve-provision-location\") pod \"isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7\" (UID: \"9ffb3d24-cf41-4903-838d-f9853530ca89\") " pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" Apr 24 21:57:40.968263 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:40.968217 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ffb3d24-cf41-4903-838d-f9853530ca89-proxy-tls\") pod \"isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7\" (UID: \"9ffb3d24-cf41-4903-838d-f9853530ca89\") " pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" Apr 24 21:57:41.065114 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:41.065084 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" podUID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.36:8643/healthz\": dial tcp 10.133.0.36:8643: connect: connection refused" Apr 24 21:57:41.069414 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:41.069392 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-a2e41b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ffb3d24-cf41-4903-838d-f9853530ca89-isvc-primary-a2e41b-kube-rbac-proxy-sar-config\") pod \"isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7\" (UID: \"9ffb3d24-cf41-4903-838d-f9853530ca89\") " pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" Apr 24 21:57:41.069502 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:41.069422 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ffb3d24-cf41-4903-838d-f9853530ca89-kserve-provision-location\") pod \"isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7\" (UID: \"9ffb3d24-cf41-4903-838d-f9853530ca89\") " pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" Apr 24 21:57:41.069502 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:41.069446 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ffb3d24-cf41-4903-838d-f9853530ca89-proxy-tls\") pod \"isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7\" (UID: \"9ffb3d24-cf41-4903-838d-f9853530ca89\") " pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" Apr 24 21:57:41.069596 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:41.069551 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67png\" (UniqueName: \"kubernetes.io/projected/9ffb3d24-cf41-4903-838d-f9853530ca89-kube-api-access-67png\") pod \"isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7\" (UID: \"9ffb3d24-cf41-4903-838d-f9853530ca89\") " pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" Apr 24 21:57:41.069817 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:41.069797 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/9ffb3d24-cf41-4903-838d-f9853530ca89-kserve-provision-location\") pod \"isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7\" (UID: \"9ffb3d24-cf41-4903-838d-f9853530ca89\") " pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" Apr 24 21:57:41.070141 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:41.070124 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-a2e41b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ffb3d24-cf41-4903-838d-f9853530ca89-isvc-primary-a2e41b-kube-rbac-proxy-sar-config\") pod \"isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7\" (UID: \"9ffb3d24-cf41-4903-838d-f9853530ca89\") " pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" Apr 24 21:57:41.072052 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:41.072011 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ffb3d24-cf41-4903-838d-f9853530ca89-proxy-tls\") pod \"isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7\" (UID: \"9ffb3d24-cf41-4903-838d-f9853530ca89\") " pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" Apr 24 21:57:41.082134 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:41.082115 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67png\" (UniqueName: \"kubernetes.io/projected/9ffb3d24-cf41-4903-838d-f9853530ca89-kube-api-access-67png\") pod \"isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7\" (UID: \"9ffb3d24-cf41-4903-838d-f9853530ca89\") " pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" Apr 24 21:57:41.138126 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:41.138100 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" Apr 24 21:57:41.258170 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:41.258103 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7"] Apr 24 21:57:41.262638 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:57:41.262610 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ffb3d24_cf41_4903_838d_f9853530ca89.slice/crio-c20164f5d785fe389eb172e079301c324f9837347bfc2b7284149049cf34dcb9 WatchSource:0}: Error finding container c20164f5d785fe389eb172e079301c324f9837347bfc2b7284149049cf34dcb9: Status 404 returned error can't find the container with id c20164f5d785fe389eb172e079301c324f9837347bfc2b7284149049cf34dcb9 Apr 24 21:57:41.264382 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:41.264367 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:57:41.293199 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:41.293175 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" event={"ID":"9ffb3d24-cf41-4903-838d-f9853530ca89","Type":"ContainerStarted","Data":"c20164f5d785fe389eb172e079301c324f9837347bfc2b7284149049cf34dcb9"} Apr 24 21:57:41.294857 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:41.294826 2570 generic.go:358] "Generic (PLEG): container finished" podID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerID="efe3a2477f972aded9f1e228199f641ad398913c502f55cb10072a6b369eae66" exitCode=2 Apr 24 21:57:41.294944 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:41.294892 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" event={"ID":"16596a46-9d58-4fa3-83c9-46f49d7b50a7","Type":"ContainerDied","Data":"efe3a2477f972aded9f1e228199f641ad398913c502f55cb10072a6b369eae66"} Apr 24 21:57:42.064418 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:42.064380 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" podUID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 24 21:57:42.298739 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:42.298699 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" event={"ID":"9ffb3d24-cf41-4903-838d-f9853530ca89","Type":"ContainerStarted","Data":"8e919e50dcf4a536d253ca6cd4614aba50dfdfd1738ecd818097c58a0c2909b6"} Apr 24 21:57:44.025269 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.025248 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" Apr 24 21:57:44.091262 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.091195 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16596a46-9d58-4fa3-83c9-46f49d7b50a7-proxy-tls\") pod \"16596a46-9d58-4fa3-83c9-46f49d7b50a7\" (UID: \"16596a46-9d58-4fa3-83c9-46f49d7b50a7\") " Apr 24 21:57:44.091401 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.091284 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/16596a46-9d58-4fa3-83c9-46f49d7b50a7-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"16596a46-9d58-4fa3-83c9-46f49d7b50a7\" (UID: \"16596a46-9d58-4fa3-83c9-46f49d7b50a7\") " Apr 24 21:57:44.091401 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.091314 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wl46\" (UniqueName: \"kubernetes.io/projected/16596a46-9d58-4fa3-83c9-46f49d7b50a7-kube-api-access-7wl46\") pod \"16596a46-9d58-4fa3-83c9-46f49d7b50a7\" (UID: \"16596a46-9d58-4fa3-83c9-46f49d7b50a7\") " Apr 24 21:57:44.091401 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.091355 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/16596a46-9d58-4fa3-83c9-46f49d7b50a7-kserve-provision-location\") pod \"16596a46-9d58-4fa3-83c9-46f49d7b50a7\" (UID: \"16596a46-9d58-4fa3-83c9-46f49d7b50a7\") " Apr 24 21:57:44.091650 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.091620 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16596a46-9d58-4fa3-83c9-46f49d7b50a7-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config") pod "16596a46-9d58-4fa3-83c9-46f49d7b50a7" (UID: "16596a46-9d58-4fa3-83c9-46f49d7b50a7"). InnerVolumeSpecName "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:57:44.091701 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.091669 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16596a46-9d58-4fa3-83c9-46f49d7b50a7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "16596a46-9d58-4fa3-83c9-46f49d7b50a7" (UID: "16596a46-9d58-4fa3-83c9-46f49d7b50a7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:44.093351 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.093326 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16596a46-9d58-4fa3-83c9-46f49d7b50a7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "16596a46-9d58-4fa3-83c9-46f49d7b50a7" (UID: "16596a46-9d58-4fa3-83c9-46f49d7b50a7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:57:44.093479 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.093437 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16596a46-9d58-4fa3-83c9-46f49d7b50a7-kube-api-access-7wl46" (OuterVolumeSpecName: "kube-api-access-7wl46") pod "16596a46-9d58-4fa3-83c9-46f49d7b50a7" (UID: "16596a46-9d58-4fa3-83c9-46f49d7b50a7"). 
InnerVolumeSpecName "kube-api-access-7wl46". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:57:44.191750 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.191725 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/16596a46-9d58-4fa3-83c9-46f49d7b50a7-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:57:44.191750 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.191747 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7wl46\" (UniqueName: \"kubernetes.io/projected/16596a46-9d58-4fa3-83c9-46f49d7b50a7-kube-api-access-7wl46\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:57:44.191884 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.191756 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/16596a46-9d58-4fa3-83c9-46f49d7b50a7-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:57:44.191884 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.191767 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/16596a46-9d58-4fa3-83c9-46f49d7b50a7-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:57:44.306273 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.306236 2570 generic.go:358] "Generic (PLEG): container finished" podID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerID="e038650fa4a7dc54b2fb9ed6c36c219cac88766c899c05066daf180bdce1db17" exitCode=0 Apr 24 21:57:44.306378 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.306272 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" event={"ID":"16596a46-9d58-4fa3-83c9-46f49d7b50a7","Type":"ContainerDied","Data":"e038650fa4a7dc54b2fb9ed6c36c219cac88766c899c05066daf180bdce1db17"} Apr 24 21:57:44.306378 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.306312 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" event={"ID":"16596a46-9d58-4fa3-83c9-46f49d7b50a7","Type":"ContainerDied","Data":"a98e2d4afc85d0912ddcbf78c3b538370c2455958d76214ffe9f6502643e558b"} Apr 24 21:57:44.306378 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.306316 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs" Apr 24 21:57:44.306378 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.306328 2570 scope.go:117] "RemoveContainer" containerID="efe3a2477f972aded9f1e228199f641ad398913c502f55cb10072a6b369eae66" Apr 24 21:57:44.315017 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.314995 2570 scope.go:117] "RemoveContainer" containerID="e038650fa4a7dc54b2fb9ed6c36c219cac88766c899c05066daf180bdce1db17" Apr 24 21:57:44.321829 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.321814 2570 scope.go:117] "RemoveContainer" containerID="87caaec52297e8e0a1e2a6b530d5bc9d08a16a9595f5d2c3e84b5316f9822434" Apr 24 21:57:44.328687 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.328670 2570 scope.go:117] "RemoveContainer" containerID="efe3a2477f972aded9f1e228199f641ad398913c502f55cb10072a6b369eae66" Apr 24 21:57:44.328747 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.328728 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs"] Apr 24 21:57:44.328964 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:57:44.328942 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efe3a2477f972aded9f1e228199f641ad398913c502f55cb10072a6b369eae66\": container with ID starting with efe3a2477f972aded9f1e228199f641ad398913c502f55cb10072a6b369eae66 not found: ID does not exist" containerID="efe3a2477f972aded9f1e228199f641ad398913c502f55cb10072a6b369eae66" Apr 24 21:57:44.329078 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.328971 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efe3a2477f972aded9f1e228199f641ad398913c502f55cb10072a6b369eae66"} err="failed to get container status \"efe3a2477f972aded9f1e228199f641ad398913c502f55cb10072a6b369eae66\": rpc error: code = NotFound desc = could not find container \"efe3a2477f972aded9f1e228199f641ad398913c502f55cb10072a6b369eae66\": container with ID starting with efe3a2477f972aded9f1e228199f641ad398913c502f55cb10072a6b369eae66 not found: ID does not exist" Apr 24 21:57:44.329078 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.328989 2570 scope.go:117] "RemoveContainer" containerID="e038650fa4a7dc54b2fb9ed6c36c219cac88766c899c05066daf180bdce1db17" Apr 24 21:57:44.329257 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:57:44.329243 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e038650fa4a7dc54b2fb9ed6c36c219cac88766c899c05066daf180bdce1db17\": container with ID starting with e038650fa4a7dc54b2fb9ed6c36c219cac88766c899c05066daf180bdce1db17 not found: ID does not exist" containerID="e038650fa4a7dc54b2fb9ed6c36c219cac88766c899c05066daf180bdce1db17" Apr 24 21:57:44.329296 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.329262 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e038650fa4a7dc54b2fb9ed6c36c219cac88766c899c05066daf180bdce1db17"} err="failed to get container status \"e038650fa4a7dc54b2fb9ed6c36c219cac88766c899c05066daf180bdce1db17\": rpc error: code = NotFound desc = could not find container \"e038650fa4a7dc54b2fb9ed6c36c219cac88766c899c05066daf180bdce1db17\": container with ID starting with e038650fa4a7dc54b2fb9ed6c36c219cac88766c899c05066daf180bdce1db17 not found: ID does not exist" Apr 24 21:57:44.329296 ip-10-0-129-230 kubenswrapper[2570]: I0424 
21:57:44.329275 2570 scope.go:117] "RemoveContainer" containerID="87caaec52297e8e0a1e2a6b530d5bc9d08a16a9595f5d2c3e84b5316f9822434" Apr 24 21:57:44.329533 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:57:44.329514 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87caaec52297e8e0a1e2a6b530d5bc9d08a16a9595f5d2c3e84b5316f9822434\": container with ID starting with 87caaec52297e8e0a1e2a6b530d5bc9d08a16a9595f5d2c3e84b5316f9822434 not found: ID does not exist" containerID="87caaec52297e8e0a1e2a6b530d5bc9d08a16a9595f5d2c3e84b5316f9822434" Apr 24 21:57:44.329576 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.329538 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87caaec52297e8e0a1e2a6b530d5bc9d08a16a9595f5d2c3e84b5316f9822434"} err="failed to get container status \"87caaec52297e8e0a1e2a6b530d5bc9d08a16a9595f5d2c3e84b5316f9822434\": rpc error: code = NotFound desc = could not find container \"87caaec52297e8e0a1e2a6b530d5bc9d08a16a9595f5d2c3e84b5316f9822434\": container with ID starting with 87caaec52297e8e0a1e2a6b530d5bc9d08a16a9595f5d2c3e84b5316f9822434 not found: ID does not exist" Apr 24 21:57:44.334760 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:44.334742 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-tkgcs"] Apr 24 21:57:45.314979 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:45.314893 2570 generic.go:358] "Generic (PLEG): container finished" podID="9ffb3d24-cf41-4903-838d-f9853530ca89" containerID="8e919e50dcf4a536d253ca6cd4614aba50dfdfd1738ecd818097c58a0c2909b6" exitCode=0 Apr 24 21:57:45.314979 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:45.314960 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" event={"ID":"9ffb3d24-cf41-4903-838d-f9853530ca89","Type":"ContainerDied","Data":"8e919e50dcf4a536d253ca6cd4614aba50dfdfd1738ecd818097c58a0c2909b6"} Apr 24 21:57:46.067044 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:46.067001 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" path="/var/lib/kubelet/pods/16596a46-9d58-4fa3-83c9-46f49d7b50a7/volumes" Apr 24 21:57:46.320771 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:46.320689 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" event={"ID":"9ffb3d24-cf41-4903-838d-f9853530ca89","Type":"ContainerStarted","Data":"939dc3990aeb64a7df187d16a4fc687c5c63da7b3d2b88696c6c9dc45ef5a9dd"} Apr 24 21:57:46.320771 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:46.320728 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" event={"ID":"9ffb3d24-cf41-4903-838d-f9853530ca89","Type":"ContainerStarted","Data":"324a14841ac61bfa8e81e8fb97567458efc340fd21460e9b93a205a9fc58525c"} Apr 24 21:57:46.321190 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:46.321009 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" Apr 24 21:57:46.321190 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:46.321173 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" Apr 24 21:57:46.322342 
ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:46.322313 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" podUID="9ffb3d24-cf41-4903-838d-f9853530ca89" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 24 21:57:46.343175 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:46.343124 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" podStartSLOduration=6.343088433 podStartE2EDuration="6.343088433s" podCreationTimestamp="2026-04-24 21:57:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:57:46.342631788 +0000 UTC m=+1818.766139194" watchObservedRunningTime="2026-04-24 21:57:46.343088433 +0000 UTC m=+1818.766595839" Apr 24 21:57:47.326546 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:47.326504 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" podUID="9ffb3d24-cf41-4903-838d-f9853530ca89" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 24 21:57:52.330417 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:52.330389 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" Apr 24 21:57:52.331077 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:57:52.331049 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" podUID="9ffb3d24-cf41-4903-838d-f9853530ca89" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 24 21:58:02.330988 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:58:02.330951 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" podUID="9ffb3d24-cf41-4903-838d-f9853530ca89" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 24 21:58:12.331698 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:58:12.331655 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" podUID="9ffb3d24-cf41-4903-838d-f9853530ca89" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 24 21:58:22.331475 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:58:22.331437 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" podUID="9ffb3d24-cf41-4903-838d-f9853530ca89" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 24 21:58:32.331893 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:58:32.331847 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" podUID="9ffb3d24-cf41-4903-838d-f9853530ca89" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 24 21:58:42.331378 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:58:42.331296 2570 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" podUID="9ffb3d24-cf41-4903-838d-f9853530ca89" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 24 21:58:52.331722 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:58:52.331697 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" Apr 24 21:59:00.944695 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:00.944667 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2"] Apr 24 21:59:00.945156 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:00.944931 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerName="kserve-container" Apr 24 21:59:00.945156 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:00.944943 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerName="kserve-container" Apr 24 21:59:00.945156 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:00.944962 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerName="kube-rbac-proxy" Apr 24 21:59:00.945156 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:00.944967 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerName="kube-rbac-proxy" Apr 24 21:59:00.945156 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:00.944983 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerName="storage-initializer" Apr 24 21:59:00.945156 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:00.944988 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerName="storage-initializer" Apr 24 21:59:00.945156 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:00.945062 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerName="kserve-container" Apr 24 21:59:00.945156 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:00.945069 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="16596a46-9d58-4fa3-83c9-46f49d7b50a7" containerName="kube-rbac-proxy" Apr 24 21:59:00.947984 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:00.947966 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" Apr 24 21:59:00.950166 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:00.950148 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-a2e41b\"" Apr 24 21:59:00.950357 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:00.950344 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-a2e41b-kube-rbac-proxy-sar-config\"" Apr 24 21:59:00.950418 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:00.950377 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-a2e41b-dockercfg-b4qcf\"" Apr 24 21:59:00.950499 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:00.950483 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-a2e41b-predictor-serving-cert\"" Apr 24 21:59:00.951164 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:00.951145 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 21:59:00.958640 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:00.958618 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2"] Apr 24 21:59:01.014416 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:01.014377 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-kserve-provision-location\") pod \"isvc-secondary-a2e41b-predictor-7d46848944-55mm2\" (UID: \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\") " pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" Apr 24 21:59:01.014584 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:01.014428 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-proxy-tls\") pod \"isvc-secondary-a2e41b-predictor-7d46848944-55mm2\" (UID: \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\") " pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" Apr 24 21:59:01.014584 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:01.014464 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-a2e41b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-isvc-secondary-a2e41b-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-a2e41b-predictor-7d46848944-55mm2\" (UID: \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\") " pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" Apr 24 21:59:01.014584 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:01.014488 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d88ng\" (UniqueName: \"kubernetes.io/projected/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-kube-api-access-d88ng\") pod \"isvc-secondary-a2e41b-predictor-7d46848944-55mm2\" (UID: \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\") " pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" Apr 24 21:59:01.014584 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:01.014511 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-cabundle-cert\") pod \"isvc-secondary-a2e41b-predictor-7d46848944-55mm2\" (UID: \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\") " pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" Apr 24 21:59:01.115878 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:01.115841 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-kserve-provision-location\") pod \"isvc-secondary-a2e41b-predictor-7d46848944-55mm2\" (UID: \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\") " pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" Apr 24 21:59:01.116058 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:01.115907 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-proxy-tls\") pod \"isvc-secondary-a2e41b-predictor-7d46848944-55mm2\" (UID: \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\") " pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" Apr 24 21:59:01.116058 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:01.115938 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-a2e41b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-isvc-secondary-a2e41b-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-a2e41b-predictor-7d46848944-55mm2\" (UID: \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\") " pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" Apr 24 21:59:01.116058 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:01.115964 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d88ng\" (UniqueName: \"kubernetes.io/projected/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-kube-api-access-d88ng\") pod \"isvc-secondary-a2e41b-predictor-7d46848944-55mm2\" (UID: \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\") " pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" Apr 24 21:59:01.116267 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:01.116242 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-cabundle-cert\") pod \"isvc-secondary-a2e41b-predictor-7d46848944-55mm2\" (UID: \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\") " pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" Apr 24 21:59:01.116267 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:01.116264 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-kserve-provision-location\") pod \"isvc-secondary-a2e41b-predictor-7d46848944-55mm2\" (UID: \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\") " pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" Apr 24 21:59:01.116620 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:01.116600 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-a2e41b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-isvc-secondary-a2e41b-kube-rbac-proxy-sar-config\") pod 
\"isvc-secondary-a2e41b-predictor-7d46848944-55mm2\" (UID: \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\") " pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" Apr 24 21:59:01.116744 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:01.116726 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-cabundle-cert\") pod \"isvc-secondary-a2e41b-predictor-7d46848944-55mm2\" (UID: \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\") " pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" Apr 24 21:59:01.118525 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:01.118502 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-proxy-tls\") pod \"isvc-secondary-a2e41b-predictor-7d46848944-55mm2\" (UID: \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\") " pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" Apr 24 21:59:01.125119 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:01.125099 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d88ng\" (UniqueName: \"kubernetes.io/projected/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-kube-api-access-d88ng\") pod \"isvc-secondary-a2e41b-predictor-7d46848944-55mm2\" (UID: \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\") " pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" Apr 24 21:59:01.258895 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:01.258832 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" Apr 24 21:59:01.382362 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:01.382332 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2"] Apr 24 21:59:01.385604 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:59:01.385571 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe2aa566_6ca3_43a4_b0ef_edf4f4b4279d.slice/crio-bbe186a8cc2ad4e241a7986b824c7ce161d368a812f4845fc0728f35cb0f5cf3 WatchSource:0}: Error finding container bbe186a8cc2ad4e241a7986b824c7ce161d368a812f4845fc0728f35cb0f5cf3: Status 404 returned error can't find the container with id bbe186a8cc2ad4e241a7986b824c7ce161d368a812f4845fc0728f35cb0f5cf3 Apr 24 21:59:01.523835 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:01.523750 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" event={"ID":"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d","Type":"ContainerStarted","Data":"b5421b7ac3e708fa5ff197a073073454f1096b9cba5f158484bb215dc2c0e6bd"} Apr 24 21:59:01.523835 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:01.523787 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" event={"ID":"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d","Type":"ContainerStarted","Data":"bbe186a8cc2ad4e241a7986b824c7ce161d368a812f4845fc0728f35cb0f5cf3"} Apr 24 21:59:07.541599 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:07.541569 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a2e41b-predictor-7d46848944-55mm2_be2aa566-6ca3-43a4-b0ef-edf4f4b4279d/storage-initializer/0.log" Apr 24 21:59:07.541958 
ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:07.541609 2570 generic.go:358] "Generic (PLEG): container finished" podID="be2aa566-6ca3-43a4-b0ef-edf4f4b4279d" containerID="b5421b7ac3e708fa5ff197a073073454f1096b9cba5f158484bb215dc2c0e6bd" exitCode=1 Apr 24 21:59:07.541958 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:07.541636 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" event={"ID":"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d","Type":"ContainerDied","Data":"b5421b7ac3e708fa5ff197a073073454f1096b9cba5f158484bb215dc2c0e6bd"} Apr 24 21:59:08.545920 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:08.545892 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a2e41b-predictor-7d46848944-55mm2_be2aa566-6ca3-43a4-b0ef-edf4f4b4279d/storage-initializer/0.log" Apr 24 21:59:08.546308 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:08.545986 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" event={"ID":"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d","Type":"ContainerStarted","Data":"332a448e5faca80c06a815e9efc558c4a46305ffb28de47c8fc9448a8dc94e4c"} Apr 24 21:59:14.564616 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:14.564589 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a2e41b-predictor-7d46848944-55mm2_be2aa566-6ca3-43a4-b0ef-edf4f4b4279d/storage-initializer/1.log" Apr 24 21:59:14.565043 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:14.564918 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a2e41b-predictor-7d46848944-55mm2_be2aa566-6ca3-43a4-b0ef-edf4f4b4279d/storage-initializer/0.log" Apr 24 21:59:14.565043 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:14.564951 2570 generic.go:358] "Generic (PLEG): container finished" podID="be2aa566-6ca3-43a4-b0ef-edf4f4b4279d" containerID="332a448e5faca80c06a815e9efc558c4a46305ffb28de47c8fc9448a8dc94e4c" exitCode=1 Apr 24 21:59:14.565137 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:14.565042 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" event={"ID":"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d","Type":"ContainerDied","Data":"332a448e5faca80c06a815e9efc558c4a46305ffb28de47c8fc9448a8dc94e4c"} Apr 24 21:59:14.565137 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:14.565082 2570 scope.go:117] "RemoveContainer" containerID="b5421b7ac3e708fa5ff197a073073454f1096b9cba5f158484bb215dc2c0e6bd" Apr 24 21:59:14.565505 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:14.565471 2570 scope.go:117] "RemoveContainer" containerID="b5421b7ac3e708fa5ff197a073073454f1096b9cba5f158484bb215dc2c0e6bd" Apr 24 21:59:14.575533 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:59:14.575502 2570 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-a2e41b-predictor-7d46848944-55mm2_kserve-ci-e2e-test_be2aa566-6ca3-43a4-b0ef-edf4f4b4279d_0 in pod sandbox bbe186a8cc2ad4e241a7986b824c7ce161d368a812f4845fc0728f35cb0f5cf3 from index: no such id: 'b5421b7ac3e708fa5ff197a073073454f1096b9cba5f158484bb215dc2c0e6bd'" containerID="b5421b7ac3e708fa5ff197a073073454f1096b9cba5f158484bb215dc2c0e6bd" Apr 24 21:59:14.575598 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:59:14.575550 2570 kuberuntime_container.go:951] 
"Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-a2e41b-predictor-7d46848944-55mm2_kserve-ci-e2e-test_be2aa566-6ca3-43a4-b0ef-edf4f4b4279d_0 in pod sandbox bbe186a8cc2ad4e241a7986b824c7ce161d368a812f4845fc0728f35cb0f5cf3 from index: no such id: 'b5421b7ac3e708fa5ff197a073073454f1096b9cba5f158484bb215dc2c0e6bd'; Skipping pod \"isvc-secondary-a2e41b-predictor-7d46848944-55mm2_kserve-ci-e2e-test(be2aa566-6ca3-43a4-b0ef-edf4f4b4279d)\"" logger="UnhandledError" Apr 24 21:59:14.576841 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:59:14.576820 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-a2e41b-predictor-7d46848944-55mm2_kserve-ci-e2e-test(be2aa566-6ca3-43a4-b0ef-edf4f4b4279d)\"" pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" podUID="be2aa566-6ca3-43a4-b0ef-edf4f4b4279d" Apr 24 21:59:15.568728 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:15.568703 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a2e41b-predictor-7d46848944-55mm2_be2aa566-6ca3-43a4-b0ef-edf4f4b4279d/storage-initializer/1.log" Apr 24 21:59:19.019262 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.019232 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2"] Apr 24 21:59:19.091605 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.091507 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7"] Apr 24 21:59:19.091894 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.091866 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" podUID="9ffb3d24-cf41-4903-838d-f9853530ca89" containerName="kserve-container" containerID="cri-o://324a14841ac61bfa8e81e8fb97567458efc340fd21460e9b93a205a9fc58525c" gracePeriod=30 Apr 24 21:59:19.091998 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.091936 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" podUID="9ffb3d24-cf41-4903-838d-f9853530ca89" containerName="kube-rbac-proxy" containerID="cri-o://939dc3990aeb64a7df187d16a4fc687c5c63da7b3d2b88696c6c9dc45ef5a9dd" gracePeriod=30 Apr 24 21:59:19.178949 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.178919 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8"] Apr 24 21:59:19.183588 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.183571 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" Apr 24 21:59:19.186270 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.186253 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-2dfca0\"" Apr 24 21:59:19.186427 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.186399 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-2dfca0-dockercfg-jxh22\"" Apr 24 21:59:19.186894 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.186877 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-2dfca0-predictor-serving-cert\"" Apr 24 21:59:19.187317 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.187299 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-2dfca0-kube-rbac-proxy-sar-config\"" Apr 24 21:59:19.200890 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.200868 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8"] Apr 24 21:59:19.213608 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.213589 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a2e41b-predictor-7d46848944-55mm2_be2aa566-6ca3-43a4-b0ef-edf4f4b4279d/storage-initializer/1.log" Apr 24 21:59:19.213712 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.213664 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" Apr 24 21:59:19.343065 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.342973 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-kserve-provision-location\") pod \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\" (UID: \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\") " Apr 24 21:59:19.343065 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.343008 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-cabundle-cert\") pod \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\" (UID: \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\") " Apr 24 21:59:19.343283 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.343098 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-proxy-tls\") pod \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\" (UID: \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\") " Apr 24 21:59:19.343283 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.343117 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d88ng\" (UniqueName: \"kubernetes.io/projected/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-kube-api-access-d88ng\") pod \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\" (UID: \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\") " Apr 24 21:59:19.343283 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.343227 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-a2e41b-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-isvc-secondary-a2e41b-kube-rbac-proxy-sar-config\") pod \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\" (UID: \"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d\") " Apr 24 21:59:19.343446 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.343287 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "be2aa566-6ca3-43a4-b0ef-edf4f4b4279d" (UID: "be2aa566-6ca3-43a4-b0ef-edf4f4b4279d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:59:19.343446 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.343388 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/045d6d29-e575-4065-ade3-f0731d04ffe1-kserve-provision-location\") pod \"isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8\" (UID: \"045d6d29-e575-4065-ade3-f0731d04ffe1\") " pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" Apr 24 21:59:19.343446 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.343434 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmfgx\" (UniqueName: \"kubernetes.io/projected/045d6d29-e575-4065-ade3-f0731d04ffe1-kube-api-access-qmfgx\") pod \"isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8\" (UID: \"045d6d29-e575-4065-ade3-f0731d04ffe1\") " pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" Apr 24 21:59:19.343594 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.343461 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-2dfca0-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/045d6d29-e575-4065-ade3-f0731d04ffe1-isvc-init-fail-2dfca0-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8\" (UID: \"045d6d29-e575-4065-ade3-f0731d04ffe1\") " pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" Apr 24 21:59:19.343594 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.343484 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "be2aa566-6ca3-43a4-b0ef-edf4f4b4279d" (UID: "be2aa566-6ca3-43a4-b0ef-edf4f4b4279d"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:59:19.343594 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.343506 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/045d6d29-e575-4065-ade3-f0731d04ffe1-cabundle-cert\") pod \"isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8\" (UID: \"045d6d29-e575-4065-ade3-f0731d04ffe1\") " pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" Apr 24 21:59:19.343594 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.343556 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/045d6d29-e575-4065-ade3-f0731d04ffe1-proxy-tls\") pod \"isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8\" (UID: \"045d6d29-e575-4065-ade3-f0731d04ffe1\") " pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" Apr 24 21:59:19.343748 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.343597 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-isvc-secondary-a2e41b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-a2e41b-kube-rbac-proxy-sar-config") pod "be2aa566-6ca3-43a4-b0ef-edf4f4b4279d" (UID: "be2aa566-6ca3-43a4-b0ef-edf4f4b4279d"). InnerVolumeSpecName "isvc-secondary-a2e41b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:59:19.343748 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.343649 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:59:19.343748 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.343671 2570 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-cabundle-cert\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:59:19.345313 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.345295 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-kube-api-access-d88ng" (OuterVolumeSpecName: "kube-api-access-d88ng") pod "be2aa566-6ca3-43a4-b0ef-edf4f4b4279d" (UID: "be2aa566-6ca3-43a4-b0ef-edf4f4b4279d"). InnerVolumeSpecName "kube-api-access-d88ng". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:59:19.345407 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.345394 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "be2aa566-6ca3-43a4-b0ef-edf4f4b4279d" (UID: "be2aa566-6ca3-43a4-b0ef-edf4f4b4279d"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:59:19.444947 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.444917 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/045d6d29-e575-4065-ade3-f0731d04ffe1-cabundle-cert\") pod \"isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8\" (UID: \"045d6d29-e575-4065-ade3-f0731d04ffe1\") " pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" Apr 24 21:59:19.445098 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.444960 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/045d6d29-e575-4065-ade3-f0731d04ffe1-proxy-tls\") pod \"isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8\" (UID: \"045d6d29-e575-4065-ade3-f0731d04ffe1\") " pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" Apr 24 21:59:19.445098 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.444994 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/045d6d29-e575-4065-ade3-f0731d04ffe1-kserve-provision-location\") pod \"isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8\" (UID: \"045d6d29-e575-4065-ade3-f0731d04ffe1\") " pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" Apr 24 21:59:19.445098 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.445014 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmfgx\" (UniqueName: \"kubernetes.io/projected/045d6d29-e575-4065-ade3-f0731d04ffe1-kube-api-access-qmfgx\") pod \"isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8\" (UID: \"045d6d29-e575-4065-ade3-f0731d04ffe1\") " pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" Apr 24 21:59:19.445098 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.445049 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-2dfca0-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/045d6d29-e575-4065-ade3-f0731d04ffe1-isvc-init-fail-2dfca0-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8\" (UID: \"045d6d29-e575-4065-ade3-f0731d04ffe1\") " pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" Apr 24 21:59:19.445098 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.445076 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:59:19.445098 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.445091 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d88ng\" (UniqueName: \"kubernetes.io/projected/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-kube-api-access-d88ng\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:59:19.445424 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.445105 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-a2e41b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d-isvc-secondary-a2e41b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:59:19.445533 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.445474 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/045d6d29-e575-4065-ade3-f0731d04ffe1-kserve-provision-location\") pod \"isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8\" (UID: \"045d6d29-e575-4065-ade3-f0731d04ffe1\") " pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" Apr 24 21:59:19.445595 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.445576 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/045d6d29-e575-4065-ade3-f0731d04ffe1-cabundle-cert\") pod \"isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8\" (UID: \"045d6d29-e575-4065-ade3-f0731d04ffe1\") " pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" Apr 24 21:59:19.445671 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.445654 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-2dfca0-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/045d6d29-e575-4065-ade3-f0731d04ffe1-isvc-init-fail-2dfca0-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8\" (UID: \"045d6d29-e575-4065-ade3-f0731d04ffe1\") " pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" Apr 24 21:59:19.447335 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.447317 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/045d6d29-e575-4065-ade3-f0731d04ffe1-proxy-tls\") pod \"isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8\" (UID: \"045d6d29-e575-4065-ade3-f0731d04ffe1\") " pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" Apr 24 21:59:19.454356 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.454341 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmfgx\" (UniqueName: \"kubernetes.io/projected/045d6d29-e575-4065-ade3-f0731d04ffe1-kube-api-access-qmfgx\") pod \"isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8\" (UID: \"045d6d29-e575-4065-ade3-f0731d04ffe1\") " pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" Apr 24 21:59:19.494040 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.494004 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" Apr 24 21:59:19.581713 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.581691 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a2e41b-predictor-7d46848944-55mm2_be2aa566-6ca3-43a4-b0ef-edf4f4b4279d/storage-initializer/1.log" Apr 24 21:59:19.581868 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.581825 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" event={"ID":"be2aa566-6ca3-43a4-b0ef-edf4f4b4279d","Type":"ContainerDied","Data":"bbe186a8cc2ad4e241a7986b824c7ce161d368a812f4845fc0728f35cb0f5cf3"} Apr 24 21:59:19.581868 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.581861 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2" Apr 24 21:59:19.581997 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.581894 2570 scope.go:117] "RemoveContainer" containerID="332a448e5faca80c06a815e9efc558c4a46305ffb28de47c8fc9448a8dc94e4c" Apr 24 21:59:19.584526 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.584495 2570 generic.go:358] "Generic (PLEG): container finished" podID="9ffb3d24-cf41-4903-838d-f9853530ca89" containerID="939dc3990aeb64a7df187d16a4fc687c5c63da7b3d2b88696c6c9dc45ef5a9dd" exitCode=2 Apr 24 21:59:19.584647 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.584555 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" event={"ID":"9ffb3d24-cf41-4903-838d-f9853530ca89","Type":"ContainerDied","Data":"939dc3990aeb64a7df187d16a4fc687c5c63da7b3d2b88696c6c9dc45ef5a9dd"} Apr 24 21:59:19.621357 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.621274 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8"] Apr 24 21:59:19.624205 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:59:19.624171 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod045d6d29_e575_4065_ade3_f0731d04ffe1.slice/crio-189b8c6c5cd14c1da2333ee531abd643e61bfc7bfa8daba571b562864644258e WatchSource:0}: Error finding container 189b8c6c5cd14c1da2333ee531abd643e61bfc7bfa8daba571b562864644258e: Status 404 returned error can't find the container with id 189b8c6c5cd14c1da2333ee531abd643e61bfc7bfa8daba571b562864644258e Apr 24 21:59:19.625522 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.625497 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2"] Apr 24 21:59:19.629170 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:19.629150 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a2e41b-predictor-7d46848944-55mm2"] Apr 24 21:59:20.067771 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:20.067738 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be2aa566-6ca3-43a4-b0ef-edf4f4b4279d" path="/var/lib/kubelet/pods/be2aa566-6ca3-43a4-b0ef-edf4f4b4279d/volumes" Apr 24 21:59:20.589295 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:20.589260 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" event={"ID":"045d6d29-e575-4065-ade3-f0731d04ffe1","Type":"ContainerStarted","Data":"4cf087f8b3ce0883b927d84845b5623879f847ed7532f85ed47b19c3d6c95400"} Apr 24 21:59:20.589295 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:20.589297 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" event={"ID":"045d6d29-e575-4065-ade3-f0731d04ffe1","Type":"ContainerStarted","Data":"189b8c6c5cd14c1da2333ee531abd643e61bfc7bfa8daba571b562864644258e"} Apr 24 21:59:22.326949 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:22.326900 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" podUID="9ffb3d24-cf41-4903-838d-f9853530ca89" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.37:8643/healthz\": dial tcp 10.133.0.37:8643: connect: connection refused" Apr 24 
21:59:22.331215 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:22.331189 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" podUID="9ffb3d24-cf41-4903-838d-f9853530ca89" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 24 21:59:23.223632 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.223612 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" Apr 24 21:59:23.374758 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.374696 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ffb3d24-cf41-4903-838d-f9853530ca89-proxy-tls\") pod \"9ffb3d24-cf41-4903-838d-f9853530ca89\" (UID: \"9ffb3d24-cf41-4903-838d-f9853530ca89\") " Apr 24 21:59:23.374758 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.374732 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67png\" (UniqueName: \"kubernetes.io/projected/9ffb3d24-cf41-4903-838d-f9853530ca89-kube-api-access-67png\") pod \"9ffb3d24-cf41-4903-838d-f9853530ca89\" (UID: \"9ffb3d24-cf41-4903-838d-f9853530ca89\") " Apr 24 21:59:23.374758 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.374759 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ffb3d24-cf41-4903-838d-f9853530ca89-kserve-provision-location\") pod \"9ffb3d24-cf41-4903-838d-f9853530ca89\" (UID: \"9ffb3d24-cf41-4903-838d-f9853530ca89\") " Apr 24 21:59:23.375256 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.374783 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-a2e41b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ffb3d24-cf41-4903-838d-f9853530ca89-isvc-primary-a2e41b-kube-rbac-proxy-sar-config\") pod \"9ffb3d24-cf41-4903-838d-f9853530ca89\" (UID: \"9ffb3d24-cf41-4903-838d-f9853530ca89\") " Apr 24 21:59:23.375256 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.375060 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ffb3d24-cf41-4903-838d-f9853530ca89-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9ffb3d24-cf41-4903-838d-f9853530ca89" (UID: "9ffb3d24-cf41-4903-838d-f9853530ca89"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:59:23.375256 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.375127 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ffb3d24-cf41-4903-838d-f9853530ca89-isvc-primary-a2e41b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-a2e41b-kube-rbac-proxy-sar-config") pod "9ffb3d24-cf41-4903-838d-f9853530ca89" (UID: "9ffb3d24-cf41-4903-838d-f9853530ca89"). InnerVolumeSpecName "isvc-primary-a2e41b-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:59:23.376869 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.376850 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ffb3d24-cf41-4903-838d-f9853530ca89-kube-api-access-67png" (OuterVolumeSpecName: "kube-api-access-67png") pod "9ffb3d24-cf41-4903-838d-f9853530ca89" (UID: "9ffb3d24-cf41-4903-838d-f9853530ca89"). InnerVolumeSpecName "kube-api-access-67png". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:59:23.376927 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.376896 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ffb3d24-cf41-4903-838d-f9853530ca89-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9ffb3d24-cf41-4903-838d-f9853530ca89" (UID: "9ffb3d24-cf41-4903-838d-f9853530ca89"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:59:23.475535 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.475515 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-67png\" (UniqueName: \"kubernetes.io/projected/9ffb3d24-cf41-4903-838d-f9853530ca89-kube-api-access-67png\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:59:23.475535 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.475534 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ffb3d24-cf41-4903-838d-f9853530ca89-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:59:23.475664 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.475546 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-a2e41b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ffb3d24-cf41-4903-838d-f9853530ca89-isvc-primary-a2e41b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:59:23.475664 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.475555 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ffb3d24-cf41-4903-838d-f9853530ca89-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:59:23.598952 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.598916 2570 generic.go:358] "Generic (PLEG): container finished" podID="9ffb3d24-cf41-4903-838d-f9853530ca89" containerID="324a14841ac61bfa8e81e8fb97567458efc340fd21460e9b93a205a9fc58525c" exitCode=0 Apr 24 21:59:23.599076 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.598966 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" event={"ID":"9ffb3d24-cf41-4903-838d-f9853530ca89","Type":"ContainerDied","Data":"324a14841ac61bfa8e81e8fb97567458efc340fd21460e9b93a205a9fc58525c"} Apr 24 21:59:23.599076 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.598988 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" event={"ID":"9ffb3d24-cf41-4903-838d-f9853530ca89","Type":"ContainerDied","Data":"c20164f5d785fe389eb172e079301c324f9837347bfc2b7284149049cf34dcb9"} Apr 24 21:59:23.599076 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.598995 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7" Apr 24 21:59:23.599076 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.599004 2570 scope.go:117] "RemoveContainer" containerID="939dc3990aeb64a7df187d16a4fc687c5c63da7b3d2b88696c6c9dc45ef5a9dd" Apr 24 21:59:23.607495 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.607474 2570 scope.go:117] "RemoveContainer" containerID="324a14841ac61bfa8e81e8fb97567458efc340fd21460e9b93a205a9fc58525c" Apr 24 21:59:23.614426 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.614406 2570 scope.go:117] "RemoveContainer" containerID="8e919e50dcf4a536d253ca6cd4614aba50dfdfd1738ecd818097c58a0c2909b6" Apr 24 21:59:23.620867 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.620851 2570 scope.go:117] "RemoveContainer" containerID="939dc3990aeb64a7df187d16a4fc687c5c63da7b3d2b88696c6c9dc45ef5a9dd" Apr 24 21:59:23.621142 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:59:23.621113 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"939dc3990aeb64a7df187d16a4fc687c5c63da7b3d2b88696c6c9dc45ef5a9dd\": container with ID starting with 939dc3990aeb64a7df187d16a4fc687c5c63da7b3d2b88696c6c9dc45ef5a9dd not found: ID does not exist" containerID="939dc3990aeb64a7df187d16a4fc687c5c63da7b3d2b88696c6c9dc45ef5a9dd" Apr 24 21:59:23.621213 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.621150 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"939dc3990aeb64a7df187d16a4fc687c5c63da7b3d2b88696c6c9dc45ef5a9dd"} err="failed to get container status \"939dc3990aeb64a7df187d16a4fc687c5c63da7b3d2b88696c6c9dc45ef5a9dd\": rpc error: code = NotFound desc = could not find container \"939dc3990aeb64a7df187d16a4fc687c5c63da7b3d2b88696c6c9dc45ef5a9dd\": container with ID starting with 939dc3990aeb64a7df187d16a4fc687c5c63da7b3d2b88696c6c9dc45ef5a9dd not found: ID does not exist" Apr 24 21:59:23.621213 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.621168 2570 scope.go:117] "RemoveContainer" containerID="324a14841ac61bfa8e81e8fb97567458efc340fd21460e9b93a205a9fc58525c" Apr 24 21:59:23.621397 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:59:23.621382 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"324a14841ac61bfa8e81e8fb97567458efc340fd21460e9b93a205a9fc58525c\": container with ID starting with 324a14841ac61bfa8e81e8fb97567458efc340fd21460e9b93a205a9fc58525c not found: ID does not exist" containerID="324a14841ac61bfa8e81e8fb97567458efc340fd21460e9b93a205a9fc58525c" Apr 24 21:59:23.621438 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.621402 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"324a14841ac61bfa8e81e8fb97567458efc340fd21460e9b93a205a9fc58525c"} err="failed to get container status \"324a14841ac61bfa8e81e8fb97567458efc340fd21460e9b93a205a9fc58525c\": rpc error: code = NotFound desc = could not find container \"324a14841ac61bfa8e81e8fb97567458efc340fd21460e9b93a205a9fc58525c\": container with ID starting with 324a14841ac61bfa8e81e8fb97567458efc340fd21460e9b93a205a9fc58525c not found: ID does not exist" Apr 24 21:59:23.621438 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.621417 2570 scope.go:117] "RemoveContainer" containerID="8e919e50dcf4a536d253ca6cd4614aba50dfdfd1738ecd818097c58a0c2909b6" Apr 24 21:59:23.621635 ip-10-0-129-230 kubenswrapper[2570]: E0424 
21:59:23.621618 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e919e50dcf4a536d253ca6cd4614aba50dfdfd1738ecd818097c58a0c2909b6\": container with ID starting with 8e919e50dcf4a536d253ca6cd4614aba50dfdfd1738ecd818097c58a0c2909b6 not found: ID does not exist" containerID="8e919e50dcf4a536d253ca6cd4614aba50dfdfd1738ecd818097c58a0c2909b6" Apr 24 21:59:23.621670 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.621643 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e919e50dcf4a536d253ca6cd4614aba50dfdfd1738ecd818097c58a0c2909b6"} err="failed to get container status \"8e919e50dcf4a536d253ca6cd4614aba50dfdfd1738ecd818097c58a0c2909b6\": rpc error: code = NotFound desc = could not find container \"8e919e50dcf4a536d253ca6cd4614aba50dfdfd1738ecd818097c58a0c2909b6\": container with ID starting with 8e919e50dcf4a536d253ca6cd4614aba50dfdfd1738ecd818097c58a0c2909b6 not found: ID does not exist" Apr 24 21:59:23.631273 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.631218 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7"] Apr 24 21:59:23.636601 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:23.636579 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a2e41b-predictor-d5d86bff5-pt5q7"] Apr 24 21:59:24.066430 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:24.066402 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ffb3d24-cf41-4903-838d-f9853530ca89" path="/var/lib/kubelet/pods/9ffb3d24-cf41-4903-838d-f9853530ca89/volumes" Apr 24 21:59:25.606166 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:25.606094 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8_045d6d29-e575-4065-ade3-f0731d04ffe1/storage-initializer/0.log" Apr 24 21:59:25.606166 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:25.606129 2570 generic.go:358] "Generic (PLEG): container finished" podID="045d6d29-e575-4065-ade3-f0731d04ffe1" containerID="4cf087f8b3ce0883b927d84845b5623879f847ed7532f85ed47b19c3d6c95400" exitCode=1 Apr 24 21:59:25.606530 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:25.606208 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" event={"ID":"045d6d29-e575-4065-ade3-f0731d04ffe1","Type":"ContainerDied","Data":"4cf087f8b3ce0883b927d84845b5623879f847ed7532f85ed47b19c3d6c95400"} Apr 24 21:59:26.610137 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:26.610113 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8_045d6d29-e575-4065-ade3-f0731d04ffe1/storage-initializer/0.log" Apr 24 21:59:26.610520 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:26.610209 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" event={"ID":"045d6d29-e575-4065-ade3-f0731d04ffe1","Type":"ContainerStarted","Data":"9fc6c6b159304dc55311f1284f2b9b5c35b4863dda22722c92c8e44396bacafa"} Apr 24 21:59:29.156451 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.156420 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8"] Apr 24 21:59:29.156834 ip-10-0-129-230 kubenswrapper[2570]: I0424 
21:59:29.156680 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" podUID="045d6d29-e575-4065-ade3-f0731d04ffe1" containerName="storage-initializer" containerID="cri-o://9fc6c6b159304dc55311f1284f2b9b5c35b4863dda22722c92c8e44396bacafa" gracePeriod=30 Apr 24 21:59:29.302330 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.302297 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj"] Apr 24 21:59:29.303101 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.303075 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be2aa566-6ca3-43a4-b0ef-edf4f4b4279d" containerName="storage-initializer" Apr 24 21:59:29.303224 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.303105 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="be2aa566-6ca3-43a4-b0ef-edf4f4b4279d" containerName="storage-initializer" Apr 24 21:59:29.303224 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.303118 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ffb3d24-cf41-4903-838d-f9853530ca89" containerName="kube-rbac-proxy" Apr 24 21:59:29.303224 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.303128 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ffb3d24-cf41-4903-838d-f9853530ca89" containerName="kube-rbac-proxy" Apr 24 21:59:29.303224 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.303158 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ffb3d24-cf41-4903-838d-f9853530ca89" containerName="storage-initializer" Apr 24 21:59:29.303224 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.303167 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ffb3d24-cf41-4903-838d-f9853530ca89" containerName="storage-initializer" Apr 24 21:59:29.303224 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.303206 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ffb3d24-cf41-4903-838d-f9853530ca89" containerName="kserve-container" Apr 24 21:59:29.303224 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.303216 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ffb3d24-cf41-4903-838d-f9853530ca89" containerName="kserve-container" Apr 24 21:59:29.303224 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.303224 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be2aa566-6ca3-43a4-b0ef-edf4f4b4279d" containerName="storage-initializer" Apr 24 21:59:29.303592 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.303234 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="be2aa566-6ca3-43a4-b0ef-edf4f4b4279d" containerName="storage-initializer" Apr 24 21:59:29.303592 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.303399 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="be2aa566-6ca3-43a4-b0ef-edf4f4b4279d" containerName="storage-initializer" Apr 24 21:59:29.303592 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.303419 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="be2aa566-6ca3-43a4-b0ef-edf4f4b4279d" containerName="storage-initializer" Apr 24 21:59:29.303592 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.303431 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ffb3d24-cf41-4903-838d-f9853530ca89" containerName="kserve-container" Apr 24 21:59:29.303592 ip-10-0-129-230 
kubenswrapper[2570]: I0424 21:59:29.303442 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ffb3d24-cf41-4903-838d-f9853530ca89" containerName="kube-rbac-proxy" Apr 24 21:59:29.308622 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.308601 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" Apr 24 21:59:29.314393 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.314356 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-predictor-serving-cert\"" Apr 24 21:59:29.314393 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.314363 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\"" Apr 24 21:59:29.314570 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.314420 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-qlzl5\"" Apr 24 21:59:29.322691 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.322669 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj"] Apr 24 21:59:29.418893 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.418867 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02ef27f2-0b27-4923-b719-a8422b91bbd7-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj\" (UID: \"02ef27f2-0b27-4923-b719-a8422b91bbd7\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" Apr 24 21:59:29.418979 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.418901 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvjlr\" (UniqueName: \"kubernetes.io/projected/02ef27f2-0b27-4923-b719-a8422b91bbd7-kube-api-access-cvjlr\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj\" (UID: \"02ef27f2-0b27-4923-b719-a8422b91bbd7\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" Apr 24 21:59:29.418979 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.418928 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/02ef27f2-0b27-4923-b719-a8422b91bbd7-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj\" (UID: \"02ef27f2-0b27-4923-b719-a8422b91bbd7\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" Apr 24 21:59:29.419133 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.419006 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/02ef27f2-0b27-4923-b719-a8422b91bbd7-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj\" (UID: \"02ef27f2-0b27-4923-b719-a8422b91bbd7\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" Apr 24 21:59:29.519364 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.519337 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/02ef27f2-0b27-4923-b719-a8422b91bbd7-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj\" (UID: \"02ef27f2-0b27-4923-b719-a8422b91bbd7\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" Apr 24 21:59:29.519491 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.519378 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvjlr\" (UniqueName: \"kubernetes.io/projected/02ef27f2-0b27-4923-b719-a8422b91bbd7-kube-api-access-cvjlr\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj\" (UID: \"02ef27f2-0b27-4923-b719-a8422b91bbd7\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" Apr 24 21:59:29.519491 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.519410 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/02ef27f2-0b27-4923-b719-a8422b91bbd7-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj\" (UID: \"02ef27f2-0b27-4923-b719-a8422b91bbd7\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" Apr 24 21:59:29.519491 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.519453 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/02ef27f2-0b27-4923-b719-a8422b91bbd7-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj\" (UID: \"02ef27f2-0b27-4923-b719-a8422b91bbd7\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" Apr 24 21:59:29.519948 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.519923 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/02ef27f2-0b27-4923-b719-a8422b91bbd7-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj\" (UID: \"02ef27f2-0b27-4923-b719-a8422b91bbd7\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" Apr 24 21:59:29.520244 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.520197 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/02ef27f2-0b27-4923-b719-a8422b91bbd7-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj\" (UID: \"02ef27f2-0b27-4923-b719-a8422b91bbd7\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" Apr 24 21:59:29.522118 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.522094 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02ef27f2-0b27-4923-b719-a8422b91bbd7-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj\" (UID: \"02ef27f2-0b27-4923-b719-a8422b91bbd7\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" Apr 24 21:59:29.529517 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.529495 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvjlr\" (UniqueName: \"kubernetes.io/projected/02ef27f2-0b27-4923-b719-a8422b91bbd7-kube-api-access-cvjlr\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj\" (UID: 
\"02ef27f2-0b27-4923-b719-a8422b91bbd7\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" Apr 24 21:59:29.618233 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.618213 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" Apr 24 21:59:29.742598 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:29.742558 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj"] Apr 24 21:59:29.745508 ip-10-0-129-230 kubenswrapper[2570]: W0424 21:59:29.745478 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02ef27f2_0b27_4923_b719_a8422b91bbd7.slice/crio-2fd764ac68b7a9171251204d4199c4fbf34a40e70977614acba9b7eaf55ca51c WatchSource:0}: Error finding container 2fd764ac68b7a9171251204d4199c4fbf34a40e70977614acba9b7eaf55ca51c: Status 404 returned error can't find the container with id 2fd764ac68b7a9171251204d4199c4fbf34a40e70977614acba9b7eaf55ca51c Apr 24 21:59:30.293152 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.293120 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8_045d6d29-e575-4065-ade3-f0731d04ffe1/storage-initializer/1.log" Apr 24 21:59:30.293545 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.293527 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8_045d6d29-e575-4065-ade3-f0731d04ffe1/storage-initializer/0.log" Apr 24 21:59:30.293627 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.293614 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" Apr 24 21:59:30.424588 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.424563 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/045d6d29-e575-4065-ade3-f0731d04ffe1-kserve-provision-location\") pod \"045d6d29-e575-4065-ade3-f0731d04ffe1\" (UID: \"045d6d29-e575-4065-ade3-f0731d04ffe1\") " Apr 24 21:59:30.424738 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.424611 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-2dfca0-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/045d6d29-e575-4065-ade3-f0731d04ffe1-isvc-init-fail-2dfca0-kube-rbac-proxy-sar-config\") pod \"045d6d29-e575-4065-ade3-f0731d04ffe1\" (UID: \"045d6d29-e575-4065-ade3-f0731d04ffe1\") " Apr 24 21:59:30.424738 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.424642 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmfgx\" (UniqueName: \"kubernetes.io/projected/045d6d29-e575-4065-ade3-f0731d04ffe1-kube-api-access-qmfgx\") pod \"045d6d29-e575-4065-ade3-f0731d04ffe1\" (UID: \"045d6d29-e575-4065-ade3-f0731d04ffe1\") " Apr 24 21:59:30.424738 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.424674 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/045d6d29-e575-4065-ade3-f0731d04ffe1-cabundle-cert\") pod \"045d6d29-e575-4065-ade3-f0731d04ffe1\" (UID: \"045d6d29-e575-4065-ade3-f0731d04ffe1\") " Apr 24 21:59:30.424905 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.424744 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/045d6d29-e575-4065-ade3-f0731d04ffe1-proxy-tls\") pod \"045d6d29-e575-4065-ade3-f0731d04ffe1\" (UID: \"045d6d29-e575-4065-ade3-f0731d04ffe1\") " Apr 24 21:59:30.424905 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.424858 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/045d6d29-e575-4065-ade3-f0731d04ffe1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "045d6d29-e575-4065-ade3-f0731d04ffe1" (UID: "045d6d29-e575-4065-ade3-f0731d04ffe1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:59:30.425078 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.425058 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/045d6d29-e575-4065-ade3-f0731d04ffe1-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "045d6d29-e575-4065-ade3-f0731d04ffe1" (UID: "045d6d29-e575-4065-ade3-f0731d04ffe1"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:59:30.425078 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.425056 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/045d6d29-e575-4065-ade3-f0731d04ffe1-isvc-init-fail-2dfca0-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-2dfca0-kube-rbac-proxy-sar-config") pod "045d6d29-e575-4065-ade3-f0731d04ffe1" (UID: "045d6d29-e575-4065-ade3-f0731d04ffe1"). InnerVolumeSpecName "isvc-init-fail-2dfca0-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:59:30.426783 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.426766 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/045d6d29-e575-4065-ade3-f0731d04ffe1-kube-api-access-qmfgx" (OuterVolumeSpecName: "kube-api-access-qmfgx") pod "045d6d29-e575-4065-ade3-f0731d04ffe1" (UID: "045d6d29-e575-4065-ade3-f0731d04ffe1"). InnerVolumeSpecName "kube-api-access-qmfgx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:59:30.426849 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.426830 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/045d6d29-e575-4065-ade3-f0731d04ffe1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "045d6d29-e575-4065-ade3-f0731d04ffe1" (UID: "045d6d29-e575-4065-ade3-f0731d04ffe1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:59:30.525777 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.525752 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/045d6d29-e575-4065-ade3-f0731d04ffe1-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:59:30.525877 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.525782 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/045d6d29-e575-4065-ade3-f0731d04ffe1-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:59:30.525877 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.525798 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-2dfca0-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/045d6d29-e575-4065-ade3-f0731d04ffe1-isvc-init-fail-2dfca0-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:59:30.525877 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.525814 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qmfgx\" (UniqueName: \"kubernetes.io/projected/045d6d29-e575-4065-ade3-f0731d04ffe1-kube-api-access-qmfgx\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:59:30.525877 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.525828 2570 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/045d6d29-e575-4065-ade3-f0731d04ffe1-cabundle-cert\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 21:59:30.622969 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.622944 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8_045d6d29-e575-4065-ade3-f0731d04ffe1/storage-initializer/1.log" Apr 24 21:59:30.623334 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.623321 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8_045d6d29-e575-4065-ade3-f0731d04ffe1/storage-initializer/0.log" Apr 24 21:59:30.623396 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.623354 2570 generic.go:358] "Generic (PLEG): container finished" podID="045d6d29-e575-4065-ade3-f0731d04ffe1" containerID="9fc6c6b159304dc55311f1284f2b9b5c35b4863dda22722c92c8e44396bacafa" exitCode=1 Apr 24 21:59:30.623464 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.623444 2570 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" Apr 24 21:59:30.623541 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.623442 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" event={"ID":"045d6d29-e575-4065-ade3-f0731d04ffe1","Type":"ContainerDied","Data":"9fc6c6b159304dc55311f1284f2b9b5c35b4863dda22722c92c8e44396bacafa"} Apr 24 21:59:30.623599 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.623568 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8" event={"ID":"045d6d29-e575-4065-ade3-f0731d04ffe1","Type":"ContainerDied","Data":"189b8c6c5cd14c1da2333ee531abd643e61bfc7bfa8daba571b562864644258e"} Apr 24 21:59:30.623599 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.623584 2570 scope.go:117] "RemoveContainer" containerID="9fc6c6b159304dc55311f1284f2b9b5c35b4863dda22722c92c8e44396bacafa" Apr 24 21:59:30.624967 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.624943 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" event={"ID":"02ef27f2-0b27-4923-b719-a8422b91bbd7","Type":"ContainerStarted","Data":"ec0d9a8ceda0c259436cdc80ffe07238f37aaff330a434e24655c3021ca60708"} Apr 24 21:59:30.625089 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.624975 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" event={"ID":"02ef27f2-0b27-4923-b719-a8422b91bbd7","Type":"ContainerStarted","Data":"2fd764ac68b7a9171251204d4199c4fbf34a40e70977614acba9b7eaf55ca51c"} Apr 24 21:59:30.632048 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.632007 2570 scope.go:117] "RemoveContainer" containerID="4cf087f8b3ce0883b927d84845b5623879f847ed7532f85ed47b19c3d6c95400" Apr 24 21:59:30.638633 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.638618 2570 scope.go:117] "RemoveContainer" containerID="9fc6c6b159304dc55311f1284f2b9b5c35b4863dda22722c92c8e44396bacafa" Apr 24 21:59:30.638860 ip-10-0-129-230 kubenswrapper[2570]: E0424 21:59:30.638838 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fc6c6b159304dc55311f1284f2b9b5c35b4863dda22722c92c8e44396bacafa\": container with ID starting with 9fc6c6b159304dc55311f1284f2b9b5c35b4863dda22722c92c8e44396bacafa not found: ID does not exist" containerID="9fc6c6b159304dc55311f1284f2b9b5c35b4863dda22722c92c8e44396bacafa" Apr 24 21:59:30.638948 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.638864 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fc6c6b159304dc55311f1284f2b9b5c35b4863dda22722c92c8e44396bacafa"} err="failed to get container status \"9fc6c6b159304dc55311f1284f2b9b5c35b4863dda22722c92c8e44396bacafa\": rpc error: code = NotFound desc = could not find container \"9fc6c6b159304dc55311f1284f2b9b5c35b4863dda22722c92c8e44396bacafa\": container with ID starting with 9fc6c6b159304dc55311f1284f2b9b5c35b4863dda22722c92c8e44396bacafa not found: ID does not exist" Apr 24 21:59:30.638948 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.638880 2570 scope.go:117] "RemoveContainer" containerID="4cf087f8b3ce0883b927d84845b5623879f847ed7532f85ed47b19c3d6c95400" Apr 24 21:59:30.639146 ip-10-0-129-230 kubenswrapper[2570]: E0424 
21:59:30.639131 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cf087f8b3ce0883b927d84845b5623879f847ed7532f85ed47b19c3d6c95400\": container with ID starting with 4cf087f8b3ce0883b927d84845b5623879f847ed7532f85ed47b19c3d6c95400 not found: ID does not exist" containerID="4cf087f8b3ce0883b927d84845b5623879f847ed7532f85ed47b19c3d6c95400" Apr 24 21:59:30.639204 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.639148 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cf087f8b3ce0883b927d84845b5623879f847ed7532f85ed47b19c3d6c95400"} err="failed to get container status \"4cf087f8b3ce0883b927d84845b5623879f847ed7532f85ed47b19c3d6c95400\": rpc error: code = NotFound desc = could not find container \"4cf087f8b3ce0883b927d84845b5623879f847ed7532f85ed47b19c3d6c95400\": container with ID starting with 4cf087f8b3ce0883b927d84845b5623879f847ed7532f85ed47b19c3d6c95400 not found: ID does not exist" Apr 24 21:59:30.674993 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.674934 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8"] Apr 24 21:59:30.677630 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:30.677611 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-2dfca0-predictor-67f5956c66-mwsr8"] Apr 24 21:59:32.066639 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:32.066604 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="045d6d29-e575-4065-ade3-f0731d04ffe1" path="/var/lib/kubelet/pods/045d6d29-e575-4065-ade3-f0731d04ffe1/volumes" Apr 24 21:59:34.638599 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:34.638566 2570 generic.go:358] "Generic (PLEG): container finished" podID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerID="ec0d9a8ceda0c259436cdc80ffe07238f37aaff330a434e24655c3021ca60708" exitCode=0 Apr 24 21:59:34.639164 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:34.638619 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" event={"ID":"02ef27f2-0b27-4923-b719-a8422b91bbd7","Type":"ContainerDied","Data":"ec0d9a8ceda0c259436cdc80ffe07238f37aaff330a434e24655c3021ca60708"} Apr 24 21:59:51.693032 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:51.692987 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" event={"ID":"02ef27f2-0b27-4923-b719-a8422b91bbd7","Type":"ContainerStarted","Data":"13767f6fc8314a9b62c579427fa15243a43a588e8737174c529c83ffa807f01b"} Apr 24 21:59:51.693429 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:51.693046 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" event={"ID":"02ef27f2-0b27-4923-b719-a8422b91bbd7","Type":"ContainerStarted","Data":"a1852b0a3adc9dbad24aa259ca7e3741a8db470ec265b4079812acfde6c42222"} Apr 24 21:59:51.693429 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:51.693262 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" Apr 24 21:59:51.721734 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:51.721669 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" 
podStartSLOduration=5.948554339 podStartE2EDuration="22.721650646s" podCreationTimestamp="2026-04-24 21:59:29 +0000 UTC" firstStartedPulling="2026-04-24 21:59:34.639874267 +0000 UTC m=+1927.063381652" lastFinishedPulling="2026-04-24 21:59:51.41297057 +0000 UTC m=+1943.836477959" observedRunningTime="2026-04-24 21:59:51.719616291 +0000 UTC m=+1944.143123701" watchObservedRunningTime="2026-04-24 21:59:51.721650646 +0000 UTC m=+1944.145158054" Apr 24 21:59:52.695570 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:52.695540 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" Apr 24 21:59:52.696637 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:52.696609 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" podUID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 24 21:59:53.698131 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:53.698088 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" podUID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 24 21:59:58.701852 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:58.701822 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" Apr 24 21:59:58.702353 ip-10-0-129-230 kubenswrapper[2570]: I0424 21:59:58.702326 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" podUID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 24 22:00:08.702679 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:00:08.702644 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" podUID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 24 22:00:18.703114 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:00:18.703074 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" podUID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 24 22:00:28.702340 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:00:28.702304 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" podUID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 24 22:00:38.702948 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:00:38.702903 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" podUID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection 
refused" Apr 24 22:00:48.702689 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:00:48.702650 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" podUID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 24 22:00:58.702652 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:00:58.702612 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" podUID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 24 22:01:02.066529 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:02.066504 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" Apr 24 22:01:09.432833 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.432799 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj"] Apr 24 22:01:09.433400 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.433241 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" podUID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerName="kserve-container" containerID="cri-o://a1852b0a3adc9dbad24aa259ca7e3741a8db470ec265b4079812acfde6c42222" gracePeriod=30 Apr 24 22:01:09.433400 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.433372 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" podUID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerName="kube-rbac-proxy" containerID="cri-o://13767f6fc8314a9b62c579427fa15243a43a588e8737174c529c83ffa807f01b" gracePeriod=30 Apr 24 22:01:09.539937 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.539911 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t"] Apr 24 22:01:09.540222 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.540209 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="045d6d29-e575-4065-ade3-f0731d04ffe1" containerName="storage-initializer" Apr 24 22:01:09.540276 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.540224 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="045d6d29-e575-4065-ade3-f0731d04ffe1" containerName="storage-initializer" Apr 24 22:01:09.540276 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.540245 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="045d6d29-e575-4065-ade3-f0731d04ffe1" containerName="storage-initializer" Apr 24 22:01:09.540276 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.540251 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="045d6d29-e575-4065-ade3-f0731d04ffe1" containerName="storage-initializer" Apr 24 22:01:09.540392 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.540302 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="045d6d29-e575-4065-ade3-f0731d04ffe1" containerName="storage-initializer" Apr 24 22:01:09.540392 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.540313 2570 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="045d6d29-e575-4065-ade3-f0731d04ffe1" containerName="storage-initializer" Apr 24 22:01:09.548370 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.548345 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" Apr 24 22:01:09.550840 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.550820 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\"" Apr 24 22:01:09.550956 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.550900 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-predictor-serving-cert\"" Apr 24 22:01:09.554416 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.554395 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t"] Apr 24 22:01:09.623992 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.623961 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0addd86a-728e-43d2-bd6e-7309e86f3b73-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-np77t\" (UID: \"0addd86a-728e-43d2-bd6e-7309e86f3b73\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" Apr 24 22:01:09.623992 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.623995 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0addd86a-728e-43d2-bd6e-7309e86f3b73-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-np77t\" (UID: \"0addd86a-728e-43d2-bd6e-7309e86f3b73\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" Apr 24 22:01:09.624213 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.624042 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0addd86a-728e-43d2-bd6e-7309e86f3b73-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-np77t\" (UID: \"0addd86a-728e-43d2-bd6e-7309e86f3b73\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" Apr 24 22:01:09.624213 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.624123 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdzkk\" (UniqueName: \"kubernetes.io/projected/0addd86a-728e-43d2-bd6e-7309e86f3b73-kube-api-access-tdzkk\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-np77t\" (UID: \"0addd86a-728e-43d2-bd6e-7309e86f3b73\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" Apr 24 22:01:09.724942 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.724866 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0addd86a-728e-43d2-bd6e-7309e86f3b73-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-np77t\" (UID: \"0addd86a-728e-43d2-bd6e-7309e86f3b73\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" 
Apr 24 22:01:09.724942 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.724905 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0addd86a-728e-43d2-bd6e-7309e86f3b73-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-np77t\" (UID: \"0addd86a-728e-43d2-bd6e-7309e86f3b73\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" Apr 24 22:01:09.724942 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.724937 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0addd86a-728e-43d2-bd6e-7309e86f3b73-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-np77t\" (UID: \"0addd86a-728e-43d2-bd6e-7309e86f3b73\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" Apr 24 22:01:09.725234 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.724974 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tdzkk\" (UniqueName: \"kubernetes.io/projected/0addd86a-728e-43d2-bd6e-7309e86f3b73-kube-api-access-tdzkk\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-np77t\" (UID: \"0addd86a-728e-43d2-bd6e-7309e86f3b73\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" Apr 24 22:01:09.725435 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.725417 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0addd86a-728e-43d2-bd6e-7309e86f3b73-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-np77t\" (UID: \"0addd86a-728e-43d2-bd6e-7309e86f3b73\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" Apr 24 22:01:09.725646 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.725621 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0addd86a-728e-43d2-bd6e-7309e86f3b73-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-np77t\" (UID: \"0addd86a-728e-43d2-bd6e-7309e86f3b73\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" Apr 24 22:01:09.727674 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.727655 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0addd86a-728e-43d2-bd6e-7309e86f3b73-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-np77t\" (UID: \"0addd86a-728e-43d2-bd6e-7309e86f3b73\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" Apr 24 22:01:09.742679 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.742654 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdzkk\" (UniqueName: \"kubernetes.io/projected/0addd86a-728e-43d2-bd6e-7309e86f3b73-kube-api-access-tdzkk\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-np77t\" (UID: \"0addd86a-728e-43d2-bd6e-7309e86f3b73\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" Apr 24 22:01:09.859633 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.859605 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" Apr 24 22:01:09.900008 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.899981 2570 generic.go:358] "Generic (PLEG): container finished" podID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerID="13767f6fc8314a9b62c579427fa15243a43a588e8737174c529c83ffa807f01b" exitCode=2 Apr 24 22:01:09.900145 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.900064 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" event={"ID":"02ef27f2-0b27-4923-b719-a8422b91bbd7","Type":"ContainerDied","Data":"13767f6fc8314a9b62c579427fa15243a43a588e8737174c529c83ffa807f01b"} Apr 24 22:01:09.984813 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:09.984694 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t"] Apr 24 22:01:09.987396 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:01:09.987368 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0addd86a_728e_43d2_bd6e_7309e86f3b73.slice/crio-694c104b611e726321baefbccd775efe90160e6c7704a5bc2434eb93692878ee WatchSource:0}: Error finding container 694c104b611e726321baefbccd775efe90160e6c7704a5bc2434eb93692878ee: Status 404 returned error can't find the container with id 694c104b611e726321baefbccd775efe90160e6c7704a5bc2434eb93692878ee Apr 24 22:01:10.904812 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:10.904775 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" event={"ID":"0addd86a-728e-43d2-bd6e-7309e86f3b73","Type":"ContainerStarted","Data":"03c65f89c08d77020952feec961fb5c7fafd919f16ca72740efc5ad234ffdb7f"} Apr 24 22:01:10.904812 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:10.904816 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" event={"ID":"0addd86a-728e-43d2-bd6e-7309e86f3b73","Type":"ContainerStarted","Data":"694c104b611e726321baefbccd775efe90160e6c7704a5bc2434eb93692878ee"} Apr 24 22:01:12.063364 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:12.063325 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" podUID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 24 22:01:13.698924 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:13.698877 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" podUID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.40:8643/healthz\": dial tcp 10.133.0.40:8643: connect: connection refused" Apr 24 22:01:13.895582 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:13.895562 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" Apr 24 22:01:13.913732 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:13.913710 2570 generic.go:358] "Generic (PLEG): container finished" podID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerID="a1852b0a3adc9dbad24aa259ca7e3741a8db470ec265b4079812acfde6c42222" exitCode=0 Apr 24 22:01:13.913836 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:13.913773 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" Apr 24 22:01:13.913836 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:13.913818 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" event={"ID":"02ef27f2-0b27-4923-b719-a8422b91bbd7","Type":"ContainerDied","Data":"a1852b0a3adc9dbad24aa259ca7e3741a8db470ec265b4079812acfde6c42222"} Apr 24 22:01:13.913954 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:13.913852 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj" event={"ID":"02ef27f2-0b27-4923-b719-a8422b91bbd7","Type":"ContainerDied","Data":"2fd764ac68b7a9171251204d4199c4fbf34a40e70977614acba9b7eaf55ca51c"} Apr 24 22:01:13.913954 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:13.913869 2570 scope.go:117] "RemoveContainer" containerID="13767f6fc8314a9b62c579427fa15243a43a588e8737174c529c83ffa807f01b" Apr 24 22:01:13.915949 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:13.915358 2570 generic.go:358] "Generic (PLEG): container finished" podID="0addd86a-728e-43d2-bd6e-7309e86f3b73" containerID="03c65f89c08d77020952feec961fb5c7fafd919f16ca72740efc5ad234ffdb7f" exitCode=0 Apr 24 22:01:13.915949 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:13.915434 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" event={"ID":"0addd86a-728e-43d2-bd6e-7309e86f3b73","Type":"ContainerDied","Data":"03c65f89c08d77020952feec961fb5c7fafd919f16ca72740efc5ad234ffdb7f"} Apr 24 22:01:13.922915 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:13.922893 2570 scope.go:117] "RemoveContainer" containerID="a1852b0a3adc9dbad24aa259ca7e3741a8db470ec265b4079812acfde6c42222" Apr 24 22:01:13.931639 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:13.931620 2570 scope.go:117] "RemoveContainer" containerID="ec0d9a8ceda0c259436cdc80ffe07238f37aaff330a434e24655c3021ca60708" Apr 24 22:01:13.940885 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:13.940851 2570 scope.go:117] "RemoveContainer" containerID="13767f6fc8314a9b62c579427fa15243a43a588e8737174c529c83ffa807f01b" Apr 24 22:01:13.941299 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:01:13.941278 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13767f6fc8314a9b62c579427fa15243a43a588e8737174c529c83ffa807f01b\": container with ID starting with 13767f6fc8314a9b62c579427fa15243a43a588e8737174c529c83ffa807f01b not found: ID does not exist" containerID="13767f6fc8314a9b62c579427fa15243a43a588e8737174c529c83ffa807f01b" Apr 24 22:01:13.941354 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:13.941313 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13767f6fc8314a9b62c579427fa15243a43a588e8737174c529c83ffa807f01b"} err="failed to get container status 
\"13767f6fc8314a9b62c579427fa15243a43a588e8737174c529c83ffa807f01b\": rpc error: code = NotFound desc = could not find container \"13767f6fc8314a9b62c579427fa15243a43a588e8737174c529c83ffa807f01b\": container with ID starting with 13767f6fc8314a9b62c579427fa15243a43a588e8737174c529c83ffa807f01b not found: ID does not exist" Apr 24 22:01:13.941354 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:13.941337 2570 scope.go:117] "RemoveContainer" containerID="a1852b0a3adc9dbad24aa259ca7e3741a8db470ec265b4079812acfde6c42222" Apr 24 22:01:13.941601 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:01:13.941584 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1852b0a3adc9dbad24aa259ca7e3741a8db470ec265b4079812acfde6c42222\": container with ID starting with a1852b0a3adc9dbad24aa259ca7e3741a8db470ec265b4079812acfde6c42222 not found: ID does not exist" containerID="a1852b0a3adc9dbad24aa259ca7e3741a8db470ec265b4079812acfde6c42222" Apr 24 22:01:13.941677 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:13.941611 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1852b0a3adc9dbad24aa259ca7e3741a8db470ec265b4079812acfde6c42222"} err="failed to get container status \"a1852b0a3adc9dbad24aa259ca7e3741a8db470ec265b4079812acfde6c42222\": rpc error: code = NotFound desc = could not find container \"a1852b0a3adc9dbad24aa259ca7e3741a8db470ec265b4079812acfde6c42222\": container with ID starting with a1852b0a3adc9dbad24aa259ca7e3741a8db470ec265b4079812acfde6c42222 not found: ID does not exist" Apr 24 22:01:13.941677 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:13.941633 2570 scope.go:117] "RemoveContainer" containerID="ec0d9a8ceda0c259436cdc80ffe07238f37aaff330a434e24655c3021ca60708" Apr 24 22:01:13.941881 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:01:13.941859 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec0d9a8ceda0c259436cdc80ffe07238f37aaff330a434e24655c3021ca60708\": container with ID starting with ec0d9a8ceda0c259436cdc80ffe07238f37aaff330a434e24655c3021ca60708 not found: ID does not exist" containerID="ec0d9a8ceda0c259436cdc80ffe07238f37aaff330a434e24655c3021ca60708" Apr 24 22:01:13.941926 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:13.941890 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec0d9a8ceda0c259436cdc80ffe07238f37aaff330a434e24655c3021ca60708"} err="failed to get container status \"ec0d9a8ceda0c259436cdc80ffe07238f37aaff330a434e24655c3021ca60708\": rpc error: code = NotFound desc = could not find container \"ec0d9a8ceda0c259436cdc80ffe07238f37aaff330a434e24655c3021ca60708\": container with ID starting with ec0d9a8ceda0c259436cdc80ffe07238f37aaff330a434e24655c3021ca60708 not found: ID does not exist" Apr 24 22:01:14.055824 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:14.055793 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02ef27f2-0b27-4923-b719-a8422b91bbd7-proxy-tls\") pod \"02ef27f2-0b27-4923-b719-a8422b91bbd7\" (UID: \"02ef27f2-0b27-4923-b719-a8422b91bbd7\") " Apr 24 22:01:14.055925 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:14.055836 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/02ef27f2-0b27-4923-b719-a8422b91bbd7-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"02ef27f2-0b27-4923-b719-a8422b91bbd7\" (UID: \"02ef27f2-0b27-4923-b719-a8422b91bbd7\") " Apr 24 22:01:14.055925 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:14.055854 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/02ef27f2-0b27-4923-b719-a8422b91bbd7-kserve-provision-location\") pod \"02ef27f2-0b27-4923-b719-a8422b91bbd7\" (UID: \"02ef27f2-0b27-4923-b719-a8422b91bbd7\") " Apr 24 22:01:14.055925 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:14.055873 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvjlr\" (UniqueName: \"kubernetes.io/projected/02ef27f2-0b27-4923-b719-a8422b91bbd7-kube-api-access-cvjlr\") pod \"02ef27f2-0b27-4923-b719-a8422b91bbd7\" (UID: \"02ef27f2-0b27-4923-b719-a8422b91bbd7\") " Apr 24 22:01:14.056231 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:14.056190 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02ef27f2-0b27-4923-b719-a8422b91bbd7-isvc-predictive-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-kube-rbac-proxy-sar-config") pod "02ef27f2-0b27-4923-b719-a8422b91bbd7" (UID: "02ef27f2-0b27-4923-b719-a8422b91bbd7"). InnerVolumeSpecName "isvc-predictive-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:01:14.056231 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:14.056208 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02ef27f2-0b27-4923-b719-a8422b91bbd7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "02ef27f2-0b27-4923-b719-a8422b91bbd7" (UID: "02ef27f2-0b27-4923-b719-a8422b91bbd7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:01:14.058466 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:14.058447 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02ef27f2-0b27-4923-b719-a8422b91bbd7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "02ef27f2-0b27-4923-b719-a8422b91bbd7" (UID: "02ef27f2-0b27-4923-b719-a8422b91bbd7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:01:14.058554 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:14.058478 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02ef27f2-0b27-4923-b719-a8422b91bbd7-kube-api-access-cvjlr" (OuterVolumeSpecName: "kube-api-access-cvjlr") pod "02ef27f2-0b27-4923-b719-a8422b91bbd7" (UID: "02ef27f2-0b27-4923-b719-a8422b91bbd7"). InnerVolumeSpecName "kube-api-access-cvjlr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:01:14.156881 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:14.156849 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/02ef27f2-0b27-4923-b719-a8422b91bbd7-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:01:14.157003 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:14.156884 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/02ef27f2-0b27-4923-b719-a8422b91bbd7-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:01:14.157003 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:14.156899 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/02ef27f2-0b27-4923-b719-a8422b91bbd7-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:01:14.157003 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:14.156910 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cvjlr\" (UniqueName: \"kubernetes.io/projected/02ef27f2-0b27-4923-b719-a8422b91bbd7-kube-api-access-cvjlr\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:01:14.229438 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:14.229373 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj"] Apr 24 22:01:14.232855 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:14.232830 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-c9fwj"] Apr 24 22:01:14.921258 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:14.921222 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" event={"ID":"0addd86a-728e-43d2-bd6e-7309e86f3b73","Type":"ContainerStarted","Data":"f05d4d842875a1dd489a73fc7deb79911b3985ebad79ee8393e508a633694b10"} Apr 24 22:01:14.921258 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:14.921263 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" event={"ID":"0addd86a-728e-43d2-bd6e-7309e86f3b73","Type":"ContainerStarted","Data":"f94fd16c5c2bad65e8e5599c4867ed5d60a454129e8d4605f379ad88bda65a5a"} Apr 24 22:01:14.921706 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:14.921492 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" Apr 24 22:01:14.940124 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:14.940080 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" podStartSLOduration=5.9400650729999995 podStartE2EDuration="5.940065073s" podCreationTimestamp="2026-04-24 22:01:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:01:14.939239736 +0000 UTC m=+2027.362747136" watchObservedRunningTime="2026-04-24 22:01:14.940065073 +0000 UTC m=+2027.363572480" Apr 24 22:01:15.925428 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:15.925392 2570 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" Apr 24 22:01:15.926687 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:15.926658 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" podUID="0addd86a-728e-43d2-bd6e-7309e86f3b73" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 24 22:01:16.066741 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:16.066712 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02ef27f2-0b27-4923-b719-a8422b91bbd7" path="/var/lib/kubelet/pods/02ef27f2-0b27-4923-b719-a8422b91bbd7/volumes" Apr 24 22:01:16.928962 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:16.928919 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" podUID="0addd86a-728e-43d2-bd6e-7309e86f3b73" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 24 22:01:21.933091 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:21.933063 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" Apr 24 22:01:21.933584 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:21.933557 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" podUID="0addd86a-728e-43d2-bd6e-7309e86f3b73" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 24 22:01:31.933756 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:31.933709 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" podUID="0addd86a-728e-43d2-bd6e-7309e86f3b73" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 24 22:01:41.934256 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:41.934174 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" podUID="0addd86a-728e-43d2-bd6e-7309e86f3b73" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 24 22:01:51.933769 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:01:51.933727 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" podUID="0addd86a-728e-43d2-bd6e-7309e86f3b73" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 24 22:02:01.933758 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:01.933718 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" podUID="0addd86a-728e-43d2-bd6e-7309e86f3b73" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 24 22:02:11.933530 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:11.933492 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" podUID="0addd86a-728e-43d2-bd6e-7309e86f3b73" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 24 22:02:21.934415 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:21.934375 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" podUID="0addd86a-728e-43d2-bd6e-7309e86f3b73" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 24 22:02:28.103048 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:28.102997 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 22:02:28.107553 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:28.107534 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 22:02:29.063192 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.063160 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" Apr 24 22:02:29.620388 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.620359 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t"] Apr 24 22:02:29.710998 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.710961 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7"] Apr 24 22:02:29.711264 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.711248 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerName="kserve-container" Apr 24 22:02:29.711264 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.711262 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerName="kserve-container" Apr 24 22:02:29.711264 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.711277 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerName="kube-rbac-proxy" Apr 24 22:02:29.711416 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.711285 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerName="kube-rbac-proxy" Apr 24 22:02:29.711416 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.711310 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerName="storage-initializer" Apr 24 22:02:29.711416 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.711317 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerName="storage-initializer" Apr 24 22:02:29.711416 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.711362 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerName="kserve-container" Apr 24 22:02:29.711416 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.711376 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="02ef27f2-0b27-4923-b719-a8422b91bbd7" containerName="kube-rbac-proxy" Apr 24 22:02:29.714337 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.714321 2570 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" Apr 24 22:02:29.716587 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.716561 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\"" Apr 24 22:02:29.717273 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.717233 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-predictor-serving-cert\"" Apr 24 22:02:29.726161 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.726142 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7"] Apr 24 22:02:29.789203 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.789181 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knh7h\" (UniqueName: \"kubernetes.io/projected/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-kube-api-access-knh7h\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7\" (UID: \"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" Apr 24 22:02:29.789334 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.789213 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7\" (UID: \"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" Apr 24 22:02:29.789334 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.789235 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7\" (UID: \"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" Apr 24 22:02:29.789334 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.789307 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7\" (UID: \"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" Apr 24 22:02:29.889911 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.889837 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7\" (UID: \"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" Apr 24 22:02:29.889911 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.889903 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-knh7h\" (UniqueName: \"kubernetes.io/projected/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-kube-api-access-knh7h\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7\" (UID: \"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" Apr 24 22:02:29.890165 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.889930 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7\" (UID: \"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" Apr 24 22:02:29.890165 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.889952 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7\" (UID: \"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" Apr 24 22:02:29.890381 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.890346 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7\" (UID: \"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" Apr 24 22:02:29.890553 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.890531 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7\" (UID: \"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" Apr 24 22:02:29.892569 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.892551 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7\" (UID: \"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" Apr 24 22:02:29.898267 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:29.898248 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-knh7h\" (UniqueName: \"kubernetes.io/projected/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-kube-api-access-knh7h\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7\" (UID: \"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" Apr 24 22:02:30.030967 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:30.030940 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" Apr 24 22:02:30.132495 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:30.132463 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" podUID="0addd86a-728e-43d2-bd6e-7309e86f3b73" containerName="kube-rbac-proxy" containerID="cri-o://f05d4d842875a1dd489a73fc7deb79911b3985ebad79ee8393e508a633694b10" gracePeriod=30 Apr 24 22:02:30.132645 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:30.132444 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" podUID="0addd86a-728e-43d2-bd6e-7309e86f3b73" containerName="kserve-container" containerID="cri-o://f94fd16c5c2bad65e8e5599c4867ed5d60a454129e8d4605f379ad88bda65a5a" gracePeriod=30 Apr 24 22:02:30.150449 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:30.150352 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7"] Apr 24 22:02:30.152985 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:02:30.152962 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d3cc6e0_ac57_4f49_b550_1c0666d4fac2.slice/crio-5ff706979a4de902b581e4671f55e7986262892d1190e1423ad1b7e7ee7b2ab9 WatchSource:0}: Error finding container 5ff706979a4de902b581e4671f55e7986262892d1190e1423ad1b7e7ee7b2ab9: Status 404 returned error can't find the container with id 5ff706979a4de902b581e4671f55e7986262892d1190e1423ad1b7e7ee7b2ab9 Apr 24 22:02:31.136658 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:31.136614 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" event={"ID":"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2","Type":"ContainerStarted","Data":"4770e537733dda89ad16466284c77c3bbe639d77f81049d53e34bd00428fb0c6"} Apr 24 22:02:31.136658 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:31.136661 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" event={"ID":"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2","Type":"ContainerStarted","Data":"5ff706979a4de902b581e4671f55e7986262892d1190e1423ad1b7e7ee7b2ab9"} Apr 24 22:02:31.138434 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:31.138408 2570 generic.go:358] "Generic (PLEG): container finished" podID="0addd86a-728e-43d2-bd6e-7309e86f3b73" containerID="f05d4d842875a1dd489a73fc7deb79911b3985ebad79ee8393e508a633694b10" exitCode=2 Apr 24 22:02:31.138535 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:31.138465 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" event={"ID":"0addd86a-728e-43d2-bd6e-7309e86f3b73","Type":"ContainerDied","Data":"f05d4d842875a1dd489a73fc7deb79911b3985ebad79ee8393e508a633694b10"} Apr 24 22:02:31.929678 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:31.929632 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" podUID="0addd86a-728e-43d2-bd6e-7309e86f3b73" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.41:8643/healthz\": dial tcp 10.133.0.41:8643: connect: connection refused" Apr 24 22:02:34.146788 ip-10-0-129-230 kubenswrapper[2570]: I0424 
22:02:34.146758 2570 generic.go:358] "Generic (PLEG): container finished" podID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerID="4770e537733dda89ad16466284c77c3bbe639d77f81049d53e34bd00428fb0c6" exitCode=0 Apr 24 22:02:34.147129 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:34.146827 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" event={"ID":"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2","Type":"ContainerDied","Data":"4770e537733dda89ad16466284c77c3bbe639d77f81049d53e34bd00428fb0c6"} Apr 24 22:02:34.148843 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:34.148787 2570 generic.go:358] "Generic (PLEG): container finished" podID="0addd86a-728e-43d2-bd6e-7309e86f3b73" containerID="f94fd16c5c2bad65e8e5599c4867ed5d60a454129e8d4605f379ad88bda65a5a" exitCode=0 Apr 24 22:02:34.148927 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:34.148860 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" event={"ID":"0addd86a-728e-43d2-bd6e-7309e86f3b73","Type":"ContainerDied","Data":"f94fd16c5c2bad65e8e5599c4867ed5d60a454129e8d4605f379ad88bda65a5a"} Apr 24 22:02:34.278699 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:34.278677 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" Apr 24 22:02:34.420394 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:34.420359 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdzkk\" (UniqueName: \"kubernetes.io/projected/0addd86a-728e-43d2-bd6e-7309e86f3b73-kube-api-access-tdzkk\") pod \"0addd86a-728e-43d2-bd6e-7309e86f3b73\" (UID: \"0addd86a-728e-43d2-bd6e-7309e86f3b73\") " Apr 24 22:02:34.420561 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:34.420436 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0addd86a-728e-43d2-bd6e-7309e86f3b73-kserve-provision-location\") pod \"0addd86a-728e-43d2-bd6e-7309e86f3b73\" (UID: \"0addd86a-728e-43d2-bd6e-7309e86f3b73\") " Apr 24 22:02:34.420561 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:34.420476 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0addd86a-728e-43d2-bd6e-7309e86f3b73-proxy-tls\") pod \"0addd86a-728e-43d2-bd6e-7309e86f3b73\" (UID: \"0addd86a-728e-43d2-bd6e-7309e86f3b73\") " Apr 24 22:02:34.420561 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:34.420522 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0addd86a-728e-43d2-bd6e-7309e86f3b73-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"0addd86a-728e-43d2-bd6e-7309e86f3b73\" (UID: \"0addd86a-728e-43d2-bd6e-7309e86f3b73\") " Apr 24 22:02:34.420796 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:34.420772 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0addd86a-728e-43d2-bd6e-7309e86f3b73-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0addd86a-728e-43d2-bd6e-7309e86f3b73" (UID: "0addd86a-728e-43d2-bd6e-7309e86f3b73"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:02:34.421002 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:34.420979 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0addd86a-728e-43d2-bd6e-7309e86f3b73-isvc-predictive-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-kube-rbac-proxy-sar-config") pod "0addd86a-728e-43d2-bd6e-7309e86f3b73" (UID: "0addd86a-728e-43d2-bd6e-7309e86f3b73"). InnerVolumeSpecName "isvc-predictive-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:02:34.423469 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:34.423442 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0addd86a-728e-43d2-bd6e-7309e86f3b73-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0addd86a-728e-43d2-bd6e-7309e86f3b73" (UID: "0addd86a-728e-43d2-bd6e-7309e86f3b73"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:02:34.423578 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:34.423533 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0addd86a-728e-43d2-bd6e-7309e86f3b73-kube-api-access-tdzkk" (OuterVolumeSpecName: "kube-api-access-tdzkk") pod "0addd86a-728e-43d2-bd6e-7309e86f3b73" (UID: "0addd86a-728e-43d2-bd6e-7309e86f3b73"). InnerVolumeSpecName "kube-api-access-tdzkk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:02:34.521361 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:34.521305 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0addd86a-728e-43d2-bd6e-7309e86f3b73-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:02:34.521361 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:34.521328 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tdzkk\" (UniqueName: \"kubernetes.io/projected/0addd86a-728e-43d2-bd6e-7309e86f3b73-kube-api-access-tdzkk\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:02:34.521361 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:34.521338 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0addd86a-728e-43d2-bd6e-7309e86f3b73-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:02:34.521361 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:34.521347 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0addd86a-728e-43d2-bd6e-7309e86f3b73-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:02:35.152870 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:35.152834 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" event={"ID":"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2","Type":"ContainerStarted","Data":"fefa530ecffa0ae262c50fddad4d5ea9abfd748f455e4a8533b3cd73ad7229cb"} Apr 24 22:02:35.153317 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:35.152882 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" 
event={"ID":"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2","Type":"ContainerStarted","Data":"bb2f65613a10599f899f9219ea864afdc54b46529404574080fded16d0105a92"} Apr 24 22:02:35.153317 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:35.153187 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" Apr 24 22:02:35.153317 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:35.153301 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" Apr 24 22:02:35.154524 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:35.154499 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" podUID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 24 22:02:35.154660 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:35.154576 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" event={"ID":"0addd86a-728e-43d2-bd6e-7309e86f3b73","Type":"ContainerDied","Data":"694c104b611e726321baefbccd775efe90160e6c7704a5bc2434eb93692878ee"} Apr 24 22:02:35.154660 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:35.154594 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t" Apr 24 22:02:35.154660 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:35.154615 2570 scope.go:117] "RemoveContainer" containerID="f05d4d842875a1dd489a73fc7deb79911b3985ebad79ee8393e508a633694b10" Apr 24 22:02:35.162718 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:35.162700 2570 scope.go:117] "RemoveContainer" containerID="f94fd16c5c2bad65e8e5599c4867ed5d60a454129e8d4605f379ad88bda65a5a" Apr 24 22:02:35.169752 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:35.169735 2570 scope.go:117] "RemoveContainer" containerID="03c65f89c08d77020952feec961fb5c7fafd919f16ca72740efc5ad234ffdb7f" Apr 24 22:02:35.176726 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:35.176693 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" podStartSLOduration=6.176678818 podStartE2EDuration="6.176678818s" podCreationTimestamp="2026-04-24 22:02:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:02:35.175286593 +0000 UTC m=+2107.598794001" watchObservedRunningTime="2026-04-24 22:02:35.176678818 +0000 UTC m=+2107.600186229" Apr 24 22:02:35.190593 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:35.190568 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t"] Apr 24 22:02:35.194326 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:35.194308 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-np77t"] Apr 24 22:02:36.066589 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:36.066555 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0addd86a-728e-43d2-bd6e-7309e86f3b73" path="/var/lib/kubelet/pods/0addd86a-728e-43d2-bd6e-7309e86f3b73/volumes" Apr 24 22:02:36.158136 
ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:36.158103 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" podUID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 24 22:02:41.162571 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:41.162523 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" Apr 24 22:02:41.163336 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:41.163294 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" podUID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 24 22:02:51.163555 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:02:51.163518 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" podUID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 24 22:03:01.163530 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:01.163489 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" podUID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 24 22:03:11.164105 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:11.164000 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" podUID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 24 22:03:21.163686 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:21.163647 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" podUID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 24 22:03:31.163472 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:31.163435 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" podUID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 24 22:03:41.163510 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:41.163471 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" podUID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 24 22:03:51.163887 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:51.163858 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" Apr 24 22:03:59.860979 ip-10-0-129-230 kubenswrapper[2570]: I0424 
22:03:59.860950 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7"] Apr 24 22:03:59.861415 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:59.861267 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" podUID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerName="kserve-container" containerID="cri-o://bb2f65613a10599f899f9219ea864afdc54b46529404574080fded16d0105a92" gracePeriod=30 Apr 24 22:03:59.861415 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:59.861327 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" podUID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerName="kube-rbac-proxy" containerID="cri-o://fefa530ecffa0ae262c50fddad4d5ea9abfd748f455e4a8533b3cd73ad7229cb" gracePeriod=30 Apr 24 22:03:59.976455 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:59.976426 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2"] Apr 24 22:03:59.976706 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:59.976693 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0addd86a-728e-43d2-bd6e-7309e86f3b73" containerName="kserve-container" Apr 24 22:03:59.976760 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:59.976708 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0addd86a-728e-43d2-bd6e-7309e86f3b73" containerName="kserve-container" Apr 24 22:03:59.976760 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:59.976718 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0addd86a-728e-43d2-bd6e-7309e86f3b73" containerName="kube-rbac-proxy" Apr 24 22:03:59.976760 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:59.976725 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0addd86a-728e-43d2-bd6e-7309e86f3b73" containerName="kube-rbac-proxy" Apr 24 22:03:59.976760 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:59.976745 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0addd86a-728e-43d2-bd6e-7309e86f3b73" containerName="storage-initializer" Apr 24 22:03:59.976760 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:59.976750 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0addd86a-728e-43d2-bd6e-7309e86f3b73" containerName="storage-initializer" Apr 24 22:03:59.976932 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:59.976800 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="0addd86a-728e-43d2-bd6e-7309e86f3b73" containerName="kserve-container" Apr 24 22:03:59.976932 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:59.976809 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="0addd86a-728e-43d2-bd6e-7309e86f3b73" containerName="kube-rbac-proxy" Apr 24 22:03:59.979574 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:59.979558 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" Apr 24 22:03:59.981997 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:59.981978 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-predictor-serving-cert\"" Apr 24 22:03:59.982078 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:59.981987 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 24 22:03:59.989455 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:59.989436 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2"] Apr 24 22:03:59.998294 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:59.998275 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcdhp\" (UniqueName: \"kubernetes.io/projected/4213cbe4-d664-4878-b787-f9a20fb231ff-kube-api-access-vcdhp\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2\" (UID: \"4213cbe4-d664-4878-b787-f9a20fb231ff\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" Apr 24 22:03:59.998369 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:59.998326 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4213cbe4-d664-4878-b787-f9a20fb231ff-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2\" (UID: \"4213cbe4-d664-4878-b787-f9a20fb231ff\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" Apr 24 22:03:59.998409 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:59.998369 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4213cbe4-d664-4878-b787-f9a20fb231ff-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2\" (UID: \"4213cbe4-d664-4878-b787-f9a20fb231ff\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" Apr 24 22:03:59.998409 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:03:59.998391 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4213cbe4-d664-4878-b787-f9a20fb231ff-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2\" (UID: \"4213cbe4-d664-4878-b787-f9a20fb231ff\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" Apr 24 22:04:00.098790 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:00.098762 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcdhp\" (UniqueName: \"kubernetes.io/projected/4213cbe4-d664-4878-b787-f9a20fb231ff-kube-api-access-vcdhp\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2\" (UID: \"4213cbe4-d664-4878-b787-f9a20fb231ff\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" Apr 24 22:04:00.098972 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:00.098828 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/4213cbe4-d664-4878-b787-f9a20fb231ff-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2\" (UID: \"4213cbe4-d664-4878-b787-f9a20fb231ff\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" Apr 24 22:04:00.098972 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:00.098865 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4213cbe4-d664-4878-b787-f9a20fb231ff-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2\" (UID: \"4213cbe4-d664-4878-b787-f9a20fb231ff\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" Apr 24 22:04:00.098972 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:00.098889 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4213cbe4-d664-4878-b787-f9a20fb231ff-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2\" (UID: \"4213cbe4-d664-4878-b787-f9a20fb231ff\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" Apr 24 22:04:00.099217 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:00.099197 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4213cbe4-d664-4878-b787-f9a20fb231ff-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2\" (UID: \"4213cbe4-d664-4878-b787-f9a20fb231ff\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" Apr 24 22:04:00.099486 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:00.099466 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4213cbe4-d664-4878-b787-f9a20fb231ff-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2\" (UID: \"4213cbe4-d664-4878-b787-f9a20fb231ff\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" Apr 24 22:04:00.101458 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:00.101436 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4213cbe4-d664-4878-b787-f9a20fb231ff-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2\" (UID: \"4213cbe4-d664-4878-b787-f9a20fb231ff\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" Apr 24 22:04:00.109345 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:00.109315 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcdhp\" (UniqueName: \"kubernetes.io/projected/4213cbe4-d664-4878-b787-f9a20fb231ff-kube-api-access-vcdhp\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2\" (UID: \"4213cbe4-d664-4878-b787-f9a20fb231ff\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" Apr 24 22:04:00.290128 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:00.290089 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" Apr 24 22:04:00.385281 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:00.385248 2570 generic.go:358] "Generic (PLEG): container finished" podID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerID="fefa530ecffa0ae262c50fddad4d5ea9abfd748f455e4a8533b3cd73ad7229cb" exitCode=2 Apr 24 22:04:00.385429 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:00.385350 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" event={"ID":"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2","Type":"ContainerDied","Data":"fefa530ecffa0ae262c50fddad4d5ea9abfd748f455e4a8533b3cd73ad7229cb"} Apr 24 22:04:00.421318 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:00.421289 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2"] Apr 24 22:04:00.424511 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:04:00.424486 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4213cbe4_d664_4878_b787_f9a20fb231ff.slice/crio-ffcbd6e6e00d131b1949b837196960a426980c2a7b988a9a589e07f784efb461 WatchSource:0}: Error finding container ffcbd6e6e00d131b1949b837196960a426980c2a7b988a9a589e07f784efb461: Status 404 returned error can't find the container with id ffcbd6e6e00d131b1949b837196960a426980c2a7b988a9a589e07f784efb461 Apr 24 22:04:00.426199 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:00.426179 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:04:01.159048 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:01.158985 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" podUID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.42:8643/healthz\": dial tcp 10.133.0.42:8643: connect: connection refused" Apr 24 22:04:01.164323 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:01.164297 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" podUID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 24 22:04:01.389478 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:01.389437 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" event={"ID":"4213cbe4-d664-4878-b787-f9a20fb231ff","Type":"ContainerStarted","Data":"d8a5676a3f5da9e0c8af5ba747e506aaf2fbdaf3442d47d1eacd76986535c4c1"} Apr 24 22:04:01.389478 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:01.389475 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" event={"ID":"4213cbe4-d664-4878-b787-f9a20fb231ff","Type":"ContainerStarted","Data":"ffcbd6e6e00d131b1949b837196960a426980c2a7b988a9a589e07f784efb461"} Apr 24 22:04:04.399002 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:04.398970 2570 generic.go:358] "Generic (PLEG): container finished" podID="4213cbe4-d664-4878-b787-f9a20fb231ff" containerID="d8a5676a3f5da9e0c8af5ba747e506aaf2fbdaf3442d47d1eacd76986535c4c1" exitCode=0 Apr 24 22:04:04.399331 ip-10-0-129-230 
kubenswrapper[2570]: I0424 22:04:04.399047 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" event={"ID":"4213cbe4-d664-4878-b787-f9a20fb231ff","Type":"ContainerDied","Data":"d8a5676a3f5da9e0c8af5ba747e506aaf2fbdaf3442d47d1eacd76986535c4c1"} Apr 24 22:04:04.991950 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:04.991928 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" Apr 24 22:04:05.030930 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.030903 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knh7h\" (UniqueName: \"kubernetes.io/projected/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-kube-api-access-knh7h\") pod \"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2\" (UID: \"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2\") " Apr 24 22:04:05.030930 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.030933 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2\" (UID: \"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2\") " Apr 24 22:04:05.031139 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.030986 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-proxy-tls\") pod \"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2\" (UID: \"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2\") " Apr 24 22:04:05.031139 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.031013 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-kserve-provision-location\") pod \"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2\" (UID: \"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2\") " Apr 24 22:04:05.031379 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.031354 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config") pod "5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" (UID: "5d3cc6e0-ac57-4f49-b550-1c0666d4fac2"). InnerVolumeSpecName "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:04:05.031463 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.031393 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" (UID: "5d3cc6e0-ac57-4f49-b550-1c0666d4fac2"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:04:05.033115 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.033095 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-kube-api-access-knh7h" (OuterVolumeSpecName: "kube-api-access-knh7h") pod "5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" (UID: "5d3cc6e0-ac57-4f49-b550-1c0666d4fac2"). InnerVolumeSpecName "kube-api-access-knh7h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:04:05.033220 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.033208 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" (UID: "5d3cc6e0-ac57-4f49-b550-1c0666d4fac2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:04:05.132469 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.132403 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-knh7h\" (UniqueName: \"kubernetes.io/projected/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-kube-api-access-knh7h\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:04:05.132469 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.132434 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:04:05.132469 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.132451 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:04:05.132469 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.132465 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:04:05.404538 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.404455 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" event={"ID":"4213cbe4-d664-4878-b787-f9a20fb231ff","Type":"ContainerStarted","Data":"30728501d254e504c228cb633a3d0cfa7deee061e8907391b451f7f7f7b42125"} Apr 24 22:04:05.404538 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.404501 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" event={"ID":"4213cbe4-d664-4878-b787-f9a20fb231ff","Type":"ContainerStarted","Data":"6cce2c3296c4ee1ae80bffe9faf2b59c99a88e377ad1450b7950d415498530eb"} Apr 24 22:04:05.404981 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.404736 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" Apr 24 22:04:05.404981 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.404759 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" Apr 24 
22:04:05.406106 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.406081 2570 generic.go:358] "Generic (PLEG): container finished" podID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerID="bb2f65613a10599f899f9219ea864afdc54b46529404574080fded16d0105a92" exitCode=0 Apr 24 22:04:05.406223 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.406129 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" event={"ID":"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2","Type":"ContainerDied","Data":"bb2f65613a10599f899f9219ea864afdc54b46529404574080fded16d0105a92"} Apr 24 22:04:05.406223 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.406155 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" event={"ID":"5d3cc6e0-ac57-4f49-b550-1c0666d4fac2","Type":"ContainerDied","Data":"5ff706979a4de902b581e4671f55e7986262892d1190e1423ad1b7e7ee7b2ab9"} Apr 24 22:04:05.406223 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.406160 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7" Apr 24 22:04:05.406223 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.406175 2570 scope.go:117] "RemoveContainer" containerID="fefa530ecffa0ae262c50fddad4d5ea9abfd748f455e4a8533b3cd73ad7229cb" Apr 24 22:04:05.414425 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.414416 2570 scope.go:117] "RemoveContainer" containerID="bb2f65613a10599f899f9219ea864afdc54b46529404574080fded16d0105a92" Apr 24 22:04:05.421194 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.421176 2570 scope.go:117] "RemoveContainer" containerID="4770e537733dda89ad16466284c77c3bbe639d77f81049d53e34bd00428fb0c6" Apr 24 22:04:05.430562 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.430523 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" podStartSLOduration=6.430511311 podStartE2EDuration="6.430511311s" podCreationTimestamp="2026-04-24 22:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:04:05.42858262 +0000 UTC m=+2197.852090028" watchObservedRunningTime="2026-04-24 22:04:05.430511311 +0000 UTC m=+2197.854018717" Apr 24 22:04:05.430894 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.430871 2570 scope.go:117] "RemoveContainer" containerID="fefa530ecffa0ae262c50fddad4d5ea9abfd748f455e4a8533b3cd73ad7229cb" Apr 24 22:04:05.431203 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:04:05.431181 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fefa530ecffa0ae262c50fddad4d5ea9abfd748f455e4a8533b3cd73ad7229cb\": container with ID starting with fefa530ecffa0ae262c50fddad4d5ea9abfd748f455e4a8533b3cd73ad7229cb not found: ID does not exist" containerID="fefa530ecffa0ae262c50fddad4d5ea9abfd748f455e4a8533b3cd73ad7229cb" Apr 24 22:04:05.431283 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.431212 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fefa530ecffa0ae262c50fddad4d5ea9abfd748f455e4a8533b3cd73ad7229cb"} err="failed to get container status \"fefa530ecffa0ae262c50fddad4d5ea9abfd748f455e4a8533b3cd73ad7229cb\": rpc error: code = NotFound desc = could not find container 
\"fefa530ecffa0ae262c50fddad4d5ea9abfd748f455e4a8533b3cd73ad7229cb\": container with ID starting with fefa530ecffa0ae262c50fddad4d5ea9abfd748f455e4a8533b3cd73ad7229cb not found: ID does not exist" Apr 24 22:04:05.431283 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.431231 2570 scope.go:117] "RemoveContainer" containerID="bb2f65613a10599f899f9219ea864afdc54b46529404574080fded16d0105a92" Apr 24 22:04:05.431466 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:04:05.431447 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb2f65613a10599f899f9219ea864afdc54b46529404574080fded16d0105a92\": container with ID starting with bb2f65613a10599f899f9219ea864afdc54b46529404574080fded16d0105a92 not found: ID does not exist" containerID="bb2f65613a10599f899f9219ea864afdc54b46529404574080fded16d0105a92" Apr 24 22:04:05.431507 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.431472 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb2f65613a10599f899f9219ea864afdc54b46529404574080fded16d0105a92"} err="failed to get container status \"bb2f65613a10599f899f9219ea864afdc54b46529404574080fded16d0105a92\": rpc error: code = NotFound desc = could not find container \"bb2f65613a10599f899f9219ea864afdc54b46529404574080fded16d0105a92\": container with ID starting with bb2f65613a10599f899f9219ea864afdc54b46529404574080fded16d0105a92 not found: ID does not exist" Apr 24 22:04:05.431507 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.431488 2570 scope.go:117] "RemoveContainer" containerID="4770e537733dda89ad16466284c77c3bbe639d77f81049d53e34bd00428fb0c6" Apr 24 22:04:05.431827 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:04:05.431806 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4770e537733dda89ad16466284c77c3bbe639d77f81049d53e34bd00428fb0c6\": container with ID starting with 4770e537733dda89ad16466284c77c3bbe639d77f81049d53e34bd00428fb0c6 not found: ID does not exist" containerID="4770e537733dda89ad16466284c77c3bbe639d77f81049d53e34bd00428fb0c6" Apr 24 22:04:05.431903 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.431833 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4770e537733dda89ad16466284c77c3bbe639d77f81049d53e34bd00428fb0c6"} err="failed to get container status \"4770e537733dda89ad16466284c77c3bbe639d77f81049d53e34bd00428fb0c6\": rpc error: code = NotFound desc = could not find container \"4770e537733dda89ad16466284c77c3bbe639d77f81049d53e34bd00428fb0c6\": container with ID starting with 4770e537733dda89ad16466284c77c3bbe639d77f81049d53e34bd00428fb0c6 not found: ID does not exist" Apr 24 22:04:05.447767 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.447699 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7"] Apr 24 22:04:05.453209 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:05.453181 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-zz2h7"] Apr 24 22:04:06.067362 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:06.067331 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" path="/var/lib/kubelet/pods/5d3cc6e0-ac57-4f49-b550-1c0666d4fac2/volumes" Apr 24 22:04:11.415386 ip-10-0-129-230 kubenswrapper[2570]: I0424 
22:04:11.415359 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" Apr 24 22:04:41.416665 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:41.416575 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" podUID="4213cbe4-d664-4878-b787-f9a20fb231ff" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.43:8080: connect: connection refused" Apr 24 22:04:51.416283 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:04:51.416237 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" podUID="4213cbe4-d664-4878-b787-f9a20fb231ff" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.43:8080: connect: connection refused" Apr 24 22:05:01.416534 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:01.416491 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" podUID="4213cbe4-d664-4878-b787-f9a20fb231ff" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.43:8080: connect: connection refused" Apr 24 22:05:11.416315 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:11.416271 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" podUID="4213cbe4-d664-4878-b787-f9a20fb231ff" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.43:8080: connect: connection refused" Apr 24 22:05:21.418960 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:21.418932 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" Apr 24 22:05:30.071253 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.071180 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2"] Apr 24 22:05:30.071945 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.071696 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" podUID="4213cbe4-d664-4878-b787-f9a20fb231ff" containerName="kserve-container" containerID="cri-o://6cce2c3296c4ee1ae80bffe9faf2b59c99a88e377ad1450b7950d415498530eb" gracePeriod=30 Apr 24 22:05:30.071945 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.071794 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" podUID="4213cbe4-d664-4878-b787-f9a20fb231ff" containerName="kube-rbac-proxy" containerID="cri-o://30728501d254e504c228cb633a3d0cfa7deee061e8907391b451f7f7f7b42125" gracePeriod=30 Apr 24 22:05:30.167120 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.167088 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql"] Apr 24 22:05:30.167420 ip-10-0-129-230 kubenswrapper[2570]: 
I0424 22:05:30.167399 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerName="storage-initializer" Apr 24 22:05:30.167420 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.167415 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerName="storage-initializer" Apr 24 22:05:30.167537 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.167426 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerName="kube-rbac-proxy" Apr 24 22:05:30.167537 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.167432 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerName="kube-rbac-proxy" Apr 24 22:05:30.167537 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.167450 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerName="kserve-container" Apr 24 22:05:30.167537 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.167457 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerName="kserve-container" Apr 24 22:05:30.167537 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.167500 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerName="kube-rbac-proxy" Apr 24 22:05:30.167537 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.167514 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d3cc6e0-ac57-4f49-b550-1c0666d4fac2" containerName="kserve-container" Apr 24 22:05:30.170495 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.170478 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" Apr 24 22:05:30.172465 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.172434 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-predictor-serving-cert\"" Apr 24 22:05:30.172562 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.172502 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 24 22:05:30.180266 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.180248 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql"] Apr 24 22:05:30.317627 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.317587 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqpq6\" (UniqueName: \"kubernetes.io/projected/105fc865-51ba-4add-81a7-bf81e72bd348-kube-api-access-jqpq6\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql\" (UID: \"105fc865-51ba-4add-81a7-bf81e72bd348\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" Apr 24 22:05:30.317783 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.317639 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/105fc865-51ba-4add-81a7-bf81e72bd348-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql\" (UID: \"105fc865-51ba-4add-81a7-bf81e72bd348\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" Apr 24 22:05:30.317783 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.317697 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/105fc865-51ba-4add-81a7-bf81e72bd348-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql\" (UID: \"105fc865-51ba-4add-81a7-bf81e72bd348\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" Apr 24 22:05:30.317783 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.317734 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/105fc865-51ba-4add-81a7-bf81e72bd348-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql\" (UID: \"105fc865-51ba-4add-81a7-bf81e72bd348\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" Apr 24 22:05:30.418341 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.418306 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/105fc865-51ba-4add-81a7-bf81e72bd348-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql\" (UID: \"105fc865-51ba-4add-81a7-bf81e72bd348\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" Apr 24 22:05:30.418452 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.418359 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/105fc865-51ba-4add-81a7-bf81e72bd348-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql\" (UID: \"105fc865-51ba-4add-81a7-bf81e72bd348\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" Apr 24 22:05:30.418452 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.418385 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqpq6\" (UniqueName: \"kubernetes.io/projected/105fc865-51ba-4add-81a7-bf81e72bd348-kube-api-access-jqpq6\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql\" (UID: \"105fc865-51ba-4add-81a7-bf81e72bd348\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" Apr 24 22:05:30.418452 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.418409 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/105fc865-51ba-4add-81a7-bf81e72bd348-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql\" (UID: \"105fc865-51ba-4add-81a7-bf81e72bd348\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" Apr 24 22:05:30.418749 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.418733 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/105fc865-51ba-4add-81a7-bf81e72bd348-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql\" (UID: \"105fc865-51ba-4add-81a7-bf81e72bd348\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" Apr 24 22:05:30.419117 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.419091 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/105fc865-51ba-4add-81a7-bf81e72bd348-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql\" (UID: \"105fc865-51ba-4add-81a7-bf81e72bd348\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" Apr 24 22:05:30.420952 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.420934 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/105fc865-51ba-4add-81a7-bf81e72bd348-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql\" (UID: \"105fc865-51ba-4add-81a7-bf81e72bd348\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" Apr 24 22:05:30.428997 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.428964 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqpq6\" (UniqueName: \"kubernetes.io/projected/105fc865-51ba-4add-81a7-bf81e72bd348-kube-api-access-jqpq6\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql\" (UID: \"105fc865-51ba-4add-81a7-bf81e72bd348\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" Apr 24 22:05:30.480953 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.480936 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" Apr 24 22:05:30.596295 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.596271 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql"] Apr 24 22:05:30.598642 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:05:30.598613 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod105fc865_51ba_4add_81a7_bf81e72bd348.slice/crio-5c17abe110f463b465bc85112a2dbf2c1cad202f3f3a9b0e03396217cb5d943d WatchSource:0}: Error finding container 5c17abe110f463b465bc85112a2dbf2c1cad202f3f3a9b0e03396217cb5d943d: Status 404 returned error can't find the container with id 5c17abe110f463b465bc85112a2dbf2c1cad202f3f3a9b0e03396217cb5d943d Apr 24 22:05:30.626760 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.626729 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" event={"ID":"105fc865-51ba-4add-81a7-bf81e72bd348","Type":"ContainerStarted","Data":"5c17abe110f463b465bc85112a2dbf2c1cad202f3f3a9b0e03396217cb5d943d"} Apr 24 22:05:30.628430 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.628409 2570 generic.go:358] "Generic (PLEG): container finished" podID="4213cbe4-d664-4878-b787-f9a20fb231ff" containerID="30728501d254e504c228cb633a3d0cfa7deee061e8907391b451f7f7f7b42125" exitCode=2 Apr 24 22:05:30.628515 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:30.628453 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" event={"ID":"4213cbe4-d664-4878-b787-f9a20fb231ff","Type":"ContainerDied","Data":"30728501d254e504c228cb633a3d0cfa7deee061e8907391b451f7f7f7b42125"} Apr 24 22:05:31.410933 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:31.410888 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" podUID="4213cbe4-d664-4878-b787-f9a20fb231ff" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.43:8643/healthz\": dial tcp 10.133.0.43:8643: connect: connection refused" Apr 24 22:05:31.416190 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:31.416161 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" podUID="4213cbe4-d664-4878-b787-f9a20fb231ff" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.43:8080: connect: connection refused" Apr 24 22:05:31.631927 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:31.631894 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" event={"ID":"105fc865-51ba-4add-81a7-bf81e72bd348","Type":"ContainerStarted","Data":"00d6fd5b9b5c62498fd1c6cfb5a6774edec80034eaa6ca97f80127d23a8ddd5d"} Apr 24 22:05:34.311147 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.311124 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" Apr 24 22:05:34.445464 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.445431 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcdhp\" (UniqueName: \"kubernetes.io/projected/4213cbe4-d664-4878-b787-f9a20fb231ff-kube-api-access-vcdhp\") pod \"4213cbe4-d664-4878-b787-f9a20fb231ff\" (UID: \"4213cbe4-d664-4878-b787-f9a20fb231ff\") " Apr 24 22:05:34.445652 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.445480 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4213cbe4-d664-4878-b787-f9a20fb231ff-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"4213cbe4-d664-4878-b787-f9a20fb231ff\" (UID: \"4213cbe4-d664-4878-b787-f9a20fb231ff\") " Apr 24 22:05:34.445652 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.445528 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4213cbe4-d664-4878-b787-f9a20fb231ff-proxy-tls\") pod \"4213cbe4-d664-4878-b787-f9a20fb231ff\" (UID: \"4213cbe4-d664-4878-b787-f9a20fb231ff\") " Apr 24 22:05:34.445652 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.445560 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4213cbe4-d664-4878-b787-f9a20fb231ff-kserve-provision-location\") pod \"4213cbe4-d664-4878-b787-f9a20fb231ff\" (UID: \"4213cbe4-d664-4878-b787-f9a20fb231ff\") " Apr 24 22:05:34.445900 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.445875 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4213cbe4-d664-4878-b787-f9a20fb231ff-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4213cbe4-d664-4878-b787-f9a20fb231ff" (UID: "4213cbe4-d664-4878-b787-f9a20fb231ff"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:05:34.445963 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.445901 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4213cbe4-d664-4878-b787-f9a20fb231ff-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config") pod "4213cbe4-d664-4878-b787-f9a20fb231ff" (UID: "4213cbe4-d664-4878-b787-f9a20fb231ff"). InnerVolumeSpecName "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:05:34.447630 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.447606 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4213cbe4-d664-4878-b787-f9a20fb231ff-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4213cbe4-d664-4878-b787-f9a20fb231ff" (UID: "4213cbe4-d664-4878-b787-f9a20fb231ff"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:05:34.447735 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.447690 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4213cbe4-d664-4878-b787-f9a20fb231ff-kube-api-access-vcdhp" (OuterVolumeSpecName: "kube-api-access-vcdhp") pod "4213cbe4-d664-4878-b787-f9a20fb231ff" (UID: "4213cbe4-d664-4878-b787-f9a20fb231ff"). InnerVolumeSpecName "kube-api-access-vcdhp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:05:34.546277 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.546251 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vcdhp\" (UniqueName: \"kubernetes.io/projected/4213cbe4-d664-4878-b787-f9a20fb231ff-kube-api-access-vcdhp\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:05:34.546277 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.546273 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4213cbe4-d664-4878-b787-f9a20fb231ff-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:05:34.546398 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.546287 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4213cbe4-d664-4878-b787-f9a20fb231ff-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:05:34.546398 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.546299 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4213cbe4-d664-4878-b787-f9a20fb231ff-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:05:34.640222 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.640200 2570 generic.go:358] "Generic (PLEG): container finished" podID="105fc865-51ba-4add-81a7-bf81e72bd348" containerID="00d6fd5b9b5c62498fd1c6cfb5a6774edec80034eaa6ca97f80127d23a8ddd5d" exitCode=0 Apr 24 22:05:34.640317 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.640273 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" event={"ID":"105fc865-51ba-4add-81a7-bf81e72bd348","Type":"ContainerDied","Data":"00d6fd5b9b5c62498fd1c6cfb5a6774edec80034eaa6ca97f80127d23a8ddd5d"} Apr 24 22:05:34.646551 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.646526 2570 generic.go:358] "Generic (PLEG): container finished" podID="4213cbe4-d664-4878-b787-f9a20fb231ff" containerID="6cce2c3296c4ee1ae80bffe9faf2b59c99a88e377ad1450b7950d415498530eb" exitCode=0 Apr 24 22:05:34.646632 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.646565 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" event={"ID":"4213cbe4-d664-4878-b787-f9a20fb231ff","Type":"ContainerDied","Data":"6cce2c3296c4ee1ae80bffe9faf2b59c99a88e377ad1450b7950d415498530eb"} Apr 24 22:05:34.646681 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.646675 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" event={"ID":"4213cbe4-d664-4878-b787-f9a20fb231ff","Type":"ContainerDied","Data":"ffcbd6e6e00d131b1949b837196960a426980c2a7b988a9a589e07f784efb461"} Apr 24 22:05:34.646733 
ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.646694 2570 scope.go:117] "RemoveContainer" containerID="30728501d254e504c228cb633a3d0cfa7deee061e8907391b451f7f7f7b42125" Apr 24 22:05:34.646782 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.646757 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2" Apr 24 22:05:34.656244 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.656229 2570 scope.go:117] "RemoveContainer" containerID="6cce2c3296c4ee1ae80bffe9faf2b59c99a88e377ad1450b7950d415498530eb" Apr 24 22:05:34.663833 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.663817 2570 scope.go:117] "RemoveContainer" containerID="d8a5676a3f5da9e0c8af5ba747e506aaf2fbdaf3442d47d1eacd76986535c4c1" Apr 24 22:05:34.673941 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.673919 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2"] Apr 24 22:05:34.676450 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.676432 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-rmsq2"] Apr 24 22:05:34.684108 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.684090 2570 scope.go:117] "RemoveContainer" containerID="30728501d254e504c228cb633a3d0cfa7deee061e8907391b451f7f7f7b42125" Apr 24 22:05:34.684367 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:05:34.684347 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30728501d254e504c228cb633a3d0cfa7deee061e8907391b451f7f7f7b42125\": container with ID starting with 30728501d254e504c228cb633a3d0cfa7deee061e8907391b451f7f7f7b42125 not found: ID does not exist" containerID="30728501d254e504c228cb633a3d0cfa7deee061e8907391b451f7f7f7b42125" Apr 24 22:05:34.684450 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.684377 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30728501d254e504c228cb633a3d0cfa7deee061e8907391b451f7f7f7b42125"} err="failed to get container status \"30728501d254e504c228cb633a3d0cfa7deee061e8907391b451f7f7f7b42125\": rpc error: code = NotFound desc = could not find container \"30728501d254e504c228cb633a3d0cfa7deee061e8907391b451f7f7f7b42125\": container with ID starting with 30728501d254e504c228cb633a3d0cfa7deee061e8907391b451f7f7f7b42125 not found: ID does not exist" Apr 24 22:05:34.684450 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.684401 2570 scope.go:117] "RemoveContainer" containerID="6cce2c3296c4ee1ae80bffe9faf2b59c99a88e377ad1450b7950d415498530eb" Apr 24 22:05:34.684667 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:05:34.684650 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cce2c3296c4ee1ae80bffe9faf2b59c99a88e377ad1450b7950d415498530eb\": container with ID starting with 6cce2c3296c4ee1ae80bffe9faf2b59c99a88e377ad1450b7950d415498530eb not found: ID does not exist" containerID="6cce2c3296c4ee1ae80bffe9faf2b59c99a88e377ad1450b7950d415498530eb" Apr 24 22:05:34.684706 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.684673 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cce2c3296c4ee1ae80bffe9faf2b59c99a88e377ad1450b7950d415498530eb"} err="failed to get container status 
\"6cce2c3296c4ee1ae80bffe9faf2b59c99a88e377ad1450b7950d415498530eb\": rpc error: code = NotFound desc = could not find container \"6cce2c3296c4ee1ae80bffe9faf2b59c99a88e377ad1450b7950d415498530eb\": container with ID starting with 6cce2c3296c4ee1ae80bffe9faf2b59c99a88e377ad1450b7950d415498530eb not found: ID does not exist" Apr 24 22:05:34.684706 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.684689 2570 scope.go:117] "RemoveContainer" containerID="d8a5676a3f5da9e0c8af5ba747e506aaf2fbdaf3442d47d1eacd76986535c4c1" Apr 24 22:05:34.684929 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:05:34.684905 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8a5676a3f5da9e0c8af5ba747e506aaf2fbdaf3442d47d1eacd76986535c4c1\": container with ID starting with d8a5676a3f5da9e0c8af5ba747e506aaf2fbdaf3442d47d1eacd76986535c4c1 not found: ID does not exist" containerID="d8a5676a3f5da9e0c8af5ba747e506aaf2fbdaf3442d47d1eacd76986535c4c1" Apr 24 22:05:34.685048 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:34.684940 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8a5676a3f5da9e0c8af5ba747e506aaf2fbdaf3442d47d1eacd76986535c4c1"} err="failed to get container status \"d8a5676a3f5da9e0c8af5ba747e506aaf2fbdaf3442d47d1eacd76986535c4c1\": rpc error: code = NotFound desc = could not find container \"d8a5676a3f5da9e0c8af5ba747e506aaf2fbdaf3442d47d1eacd76986535c4c1\": container with ID starting with d8a5676a3f5da9e0c8af5ba747e506aaf2fbdaf3442d47d1eacd76986535c4c1 not found: ID does not exist" Apr 24 22:05:35.651398 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:35.651367 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" event={"ID":"105fc865-51ba-4add-81a7-bf81e72bd348","Type":"ContainerStarted","Data":"eb244b09e612dece7b368c70ab16ddb55e1f151316d2f5185a90e03788118ee5"} Apr 24 22:05:35.651398 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:35.651403 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" event={"ID":"105fc865-51ba-4add-81a7-bf81e72bd348","Type":"ContainerStarted","Data":"3dea5109aa917ea20a0a7973cacdcdebfe039da66ee2468055b80770188c8ba9"} Apr 24 22:05:35.651800 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:35.651605 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" Apr 24 22:05:35.651800 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:35.651661 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" Apr 24 22:05:35.670909 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:35.670866 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" podStartSLOduration=5.670855151 podStartE2EDuration="5.670855151s" podCreationTimestamp="2026-04-24 22:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:05:35.670301081 +0000 UTC m=+2288.093808498" watchObservedRunningTime="2026-04-24 22:05:35.670855151 +0000 UTC m=+2288.094362557" Apr 24 22:05:36.066486 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:36.066456 2570 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4213cbe4-d664-4878-b787-f9a20fb231ff" path="/var/lib/kubelet/pods/4213cbe4-d664-4878-b787-f9a20fb231ff/volumes" Apr 24 22:05:41.659627 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:05:41.659598 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" Apr 24 22:06:11.660386 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:11.660306 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" podUID="105fc865-51ba-4add-81a7-bf81e72bd348" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.44:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.44:8080: connect: connection refused" Apr 24 22:06:21.660081 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:21.660045 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" podUID="105fc865-51ba-4add-81a7-bf81e72bd348" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.44:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.44:8080: connect: connection refused" Apr 24 22:06:31.661102 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:31.661057 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" podUID="105fc865-51ba-4add-81a7-bf81e72bd348" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.44:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.44:8080: connect: connection refused" Apr 24 22:06:41.663499 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:41.663470 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" Apr 24 22:06:50.253219 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.253185 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql"] Apr 24 22:06:50.253671 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.253571 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" podUID="105fc865-51ba-4add-81a7-bf81e72bd348" containerName="kserve-container" containerID="cri-o://3dea5109aa917ea20a0a7973cacdcdebfe039da66ee2468055b80770188c8ba9" gracePeriod=30 Apr 24 22:06:50.253748 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.253646 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" podUID="105fc865-51ba-4add-81a7-bf81e72bd348" containerName="kube-rbac-proxy" containerID="cri-o://eb244b09e612dece7b368c70ab16ddb55e1f151316d2f5185a90e03788118ee5" gracePeriod=30 Apr 24 22:06:50.323238 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.323207 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw"] Apr 24 22:06:50.323480 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.323468 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4213cbe4-d664-4878-b787-f9a20fb231ff" containerName="kserve-container" Apr 24 
22:06:50.323531 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.323483 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="4213cbe4-d664-4878-b787-f9a20fb231ff" containerName="kserve-container" Apr 24 22:06:50.323531 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.323494 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4213cbe4-d664-4878-b787-f9a20fb231ff" containerName="kube-rbac-proxy" Apr 24 22:06:50.323531 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.323500 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="4213cbe4-d664-4878-b787-f9a20fb231ff" containerName="kube-rbac-proxy" Apr 24 22:06:50.323531 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.323513 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4213cbe4-d664-4878-b787-f9a20fb231ff" containerName="storage-initializer" Apr 24 22:06:50.323531 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.323519 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="4213cbe4-d664-4878-b787-f9a20fb231ff" containerName="storage-initializer" Apr 24 22:06:50.323682 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.323560 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="4213cbe4-d664-4878-b787-f9a20fb231ff" containerName="kube-rbac-proxy" Apr 24 22:06:50.323682 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.323568 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="4213cbe4-d664-4878-b787-f9a20fb231ff" containerName="kserve-container" Apr 24 22:06:50.327755 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.327740 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" Apr 24 22:06:50.330397 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.330375 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\"" Apr 24 22:06:50.330397 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.330377 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-predictor-serving-cert\"" Apr 24 22:06:50.337419 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.337400 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw"] Apr 24 22:06:50.344679 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.344659 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5abd18b1-6567-498c-b4d1-444e34d775a5-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw\" (UID: \"5abd18b1-6567-498c-b4d1-444e34d775a5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" Apr 24 22:06:50.344769 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.344703 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hbk6\" (UniqueName: \"kubernetes.io/projected/5abd18b1-6567-498c-b4d1-444e34d775a5-kube-api-access-4hbk6\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw\" (UID: \"5abd18b1-6567-498c-b4d1-444e34d775a5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" Apr 24 22:06:50.344810 ip-10-0-129-230 
kubenswrapper[2570]: I0424 22:06:50.344765 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5abd18b1-6567-498c-b4d1-444e34d775a5-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw\" (UID: \"5abd18b1-6567-498c-b4d1-444e34d775a5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" Apr 24 22:06:50.344810 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.344791 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5abd18b1-6567-498c-b4d1-444e34d775a5-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw\" (UID: \"5abd18b1-6567-498c-b4d1-444e34d775a5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" Apr 24 22:06:50.445884 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.445861 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5abd18b1-6567-498c-b4d1-444e34d775a5-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw\" (UID: \"5abd18b1-6567-498c-b4d1-444e34d775a5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" Apr 24 22:06:50.445983 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.445893 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5abd18b1-6567-498c-b4d1-444e34d775a5-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw\" (UID: \"5abd18b1-6567-498c-b4d1-444e34d775a5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" Apr 24 22:06:50.445983 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.445926 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hbk6\" (UniqueName: \"kubernetes.io/projected/5abd18b1-6567-498c-b4d1-444e34d775a5-kube-api-access-4hbk6\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw\" (UID: \"5abd18b1-6567-498c-b4d1-444e34d775a5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" Apr 24 22:06:50.445983 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.445972 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5abd18b1-6567-498c-b4d1-444e34d775a5-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw\" (UID: \"5abd18b1-6567-498c-b4d1-444e34d775a5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" Apr 24 22:06:50.446348 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.446320 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5abd18b1-6567-498c-b4d1-444e34d775a5-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw\" (UID: \"5abd18b1-6567-498c-b4d1-444e34d775a5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" Apr 24 22:06:50.446496 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.446477 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5abd18b1-6567-498c-b4d1-444e34d775a5-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw\" (UID: \"5abd18b1-6567-498c-b4d1-444e34d775a5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" Apr 24 22:06:50.448285 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.448265 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5abd18b1-6567-498c-b4d1-444e34d775a5-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw\" (UID: \"5abd18b1-6567-498c-b4d1-444e34d775a5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" Apr 24 22:06:50.454841 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.454818 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hbk6\" (UniqueName: \"kubernetes.io/projected/5abd18b1-6567-498c-b4d1-444e34d775a5-kube-api-access-4hbk6\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw\" (UID: \"5abd18b1-6567-498c-b4d1-444e34d775a5\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" Apr 24 22:06:50.638324 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.638249 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" Apr 24 22:06:50.764373 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.764339 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw"] Apr 24 22:06:50.768256 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:06:50.768228 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5abd18b1_6567_498c_b4d1_444e34d775a5.slice/crio-ab137b3ac7180839adbc78045cffb8d35947d42681d2610952fd79ebfebea2ec WatchSource:0}: Error finding container ab137b3ac7180839adbc78045cffb8d35947d42681d2610952fd79ebfebea2ec: Status 404 returned error can't find the container with id ab137b3ac7180839adbc78045cffb8d35947d42681d2610952fd79ebfebea2ec Apr 24 22:06:50.858949 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.858914 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" event={"ID":"5abd18b1-6567-498c-b4d1-444e34d775a5","Type":"ContainerStarted","Data":"aa47481d368d53b9ef3d4868a45e9e190e519fc4fe126b4f2a71d2f68dadbd81"} Apr 24 22:06:50.859093 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.858956 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" event={"ID":"5abd18b1-6567-498c-b4d1-444e34d775a5","Type":"ContainerStarted","Data":"ab137b3ac7180839adbc78045cffb8d35947d42681d2610952fd79ebfebea2ec"} Apr 24 22:06:50.860811 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.860780 2570 generic.go:358] "Generic (PLEG): container finished" podID="105fc865-51ba-4add-81a7-bf81e72bd348" containerID="eb244b09e612dece7b368c70ab16ddb55e1f151316d2f5185a90e03788118ee5" exitCode=2 Apr 24 22:06:50.860911 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:50.860858 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" event={"ID":"105fc865-51ba-4add-81a7-bf81e72bd348","Type":"ContainerDied","Data":"eb244b09e612dece7b368c70ab16ddb55e1f151316d2f5185a90e03788118ee5"} Apr 24 22:06:51.654463 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:51.654418 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" podUID="105fc865-51ba-4add-81a7-bf81e72bd348" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.44:8643/healthz\": dial tcp 10.133.0.44:8643: connect: connection refused" Apr 24 22:06:51.660814 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:51.660783 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" podUID="105fc865-51ba-4add-81a7-bf81e72bd348" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.44:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.44:8080: connect: connection refused" Apr 24 22:06:54.391340 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.391317 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" Apr 24 22:06:54.471454 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.471401 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/105fc865-51ba-4add-81a7-bf81e72bd348-proxy-tls\") pod \"105fc865-51ba-4add-81a7-bf81e72bd348\" (UID: \"105fc865-51ba-4add-81a7-bf81e72bd348\") " Apr 24 22:06:54.471454 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.471438 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/105fc865-51ba-4add-81a7-bf81e72bd348-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"105fc865-51ba-4add-81a7-bf81e72bd348\" (UID: \"105fc865-51ba-4add-81a7-bf81e72bd348\") " Apr 24 22:06:54.471632 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.471469 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/105fc865-51ba-4add-81a7-bf81e72bd348-kserve-provision-location\") pod \"105fc865-51ba-4add-81a7-bf81e72bd348\" (UID: \"105fc865-51ba-4add-81a7-bf81e72bd348\") " Apr 24 22:06:54.471632 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.471591 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqpq6\" (UniqueName: \"kubernetes.io/projected/105fc865-51ba-4add-81a7-bf81e72bd348-kube-api-access-jqpq6\") pod \"105fc865-51ba-4add-81a7-bf81e72bd348\" (UID: \"105fc865-51ba-4add-81a7-bf81e72bd348\") " Apr 24 22:06:54.471774 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.471754 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/105fc865-51ba-4add-81a7-bf81e72bd348-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "105fc865-51ba-4add-81a7-bf81e72bd348" (UID: "105fc865-51ba-4add-81a7-bf81e72bd348"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:06:54.471826 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.471774 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/105fc865-51ba-4add-81a7-bf81e72bd348-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config") pod "105fc865-51ba-4add-81a7-bf81e72bd348" (UID: "105fc865-51ba-4add-81a7-bf81e72bd348"). InnerVolumeSpecName "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:06:54.471884 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.471831 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/105fc865-51ba-4add-81a7-bf81e72bd348-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:06:54.471884 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.471852 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/105fc865-51ba-4add-81a7-bf81e72bd348-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:06:54.473611 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.473583 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/105fc865-51ba-4add-81a7-bf81e72bd348-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "105fc865-51ba-4add-81a7-bf81e72bd348" (UID: "105fc865-51ba-4add-81a7-bf81e72bd348"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:06:54.473704 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.473672 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/105fc865-51ba-4add-81a7-bf81e72bd348-kube-api-access-jqpq6" (OuterVolumeSpecName: "kube-api-access-jqpq6") pod "105fc865-51ba-4add-81a7-bf81e72bd348" (UID: "105fc865-51ba-4add-81a7-bf81e72bd348"). InnerVolumeSpecName "kube-api-access-jqpq6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:06:54.572788 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.572765 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/105fc865-51ba-4add-81a7-bf81e72bd348-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:06:54.572788 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.572785 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jqpq6\" (UniqueName: \"kubernetes.io/projected/105fc865-51ba-4add-81a7-bf81e72bd348-kube-api-access-jqpq6\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:06:54.872938 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.872868 2570 generic.go:358] "Generic (PLEG): container finished" podID="5abd18b1-6567-498c-b4d1-444e34d775a5" containerID="aa47481d368d53b9ef3d4868a45e9e190e519fc4fe126b4f2a71d2f68dadbd81" exitCode=0 Apr 24 22:06:54.873076 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.872943 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" event={"ID":"5abd18b1-6567-498c-b4d1-444e34d775a5","Type":"ContainerDied","Data":"aa47481d368d53b9ef3d4868a45e9e190e519fc4fe126b4f2a71d2f68dadbd81"} Apr 24 22:06:54.874677 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.874657 2570 generic.go:358] "Generic (PLEG): container finished" podID="105fc865-51ba-4add-81a7-bf81e72bd348" containerID="3dea5109aa917ea20a0a7973cacdcdebfe039da66ee2468055b80770188c8ba9" exitCode=0 Apr 24 22:06:54.874775 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.874729 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" event={"ID":"105fc865-51ba-4add-81a7-bf81e72bd348","Type":"ContainerDied","Data":"3dea5109aa917ea20a0a7973cacdcdebfe039da66ee2468055b80770188c8ba9"} Apr 24 22:06:54.874775 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.874740 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" Apr 24 22:06:54.874775 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.874759 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql" event={"ID":"105fc865-51ba-4add-81a7-bf81e72bd348","Type":"ContainerDied","Data":"5c17abe110f463b465bc85112a2dbf2c1cad202f3f3a9b0e03396217cb5d943d"} Apr 24 22:06:54.874897 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.874778 2570 scope.go:117] "RemoveContainer" containerID="eb244b09e612dece7b368c70ab16ddb55e1f151316d2f5185a90e03788118ee5" Apr 24 22:06:54.882921 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.882855 2570 scope.go:117] "RemoveContainer" containerID="3dea5109aa917ea20a0a7973cacdcdebfe039da66ee2468055b80770188c8ba9" Apr 24 22:06:54.889921 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.889899 2570 scope.go:117] "RemoveContainer" containerID="00d6fd5b9b5c62498fd1c6cfb5a6774edec80034eaa6ca97f80127d23a8ddd5d" Apr 24 22:06:54.897392 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.897373 2570 scope.go:117] "RemoveContainer" containerID="eb244b09e612dece7b368c70ab16ddb55e1f151316d2f5185a90e03788118ee5" Apr 24 22:06:54.897648 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:06:54.897631 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb244b09e612dece7b368c70ab16ddb55e1f151316d2f5185a90e03788118ee5\": container with ID starting with eb244b09e612dece7b368c70ab16ddb55e1f151316d2f5185a90e03788118ee5 not found: ID does not exist" containerID="eb244b09e612dece7b368c70ab16ddb55e1f151316d2f5185a90e03788118ee5" Apr 24 22:06:54.897696 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.897656 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb244b09e612dece7b368c70ab16ddb55e1f151316d2f5185a90e03788118ee5"} err="failed to get container status \"eb244b09e612dece7b368c70ab16ddb55e1f151316d2f5185a90e03788118ee5\": rpc error: code = NotFound desc = could not find container \"eb244b09e612dece7b368c70ab16ddb55e1f151316d2f5185a90e03788118ee5\": container with ID starting with eb244b09e612dece7b368c70ab16ddb55e1f151316d2f5185a90e03788118ee5 not found: ID does not exist" Apr 24 22:06:54.897696 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.897673 2570 scope.go:117] "RemoveContainer" containerID="3dea5109aa917ea20a0a7973cacdcdebfe039da66ee2468055b80770188c8ba9" Apr 24 22:06:54.897910 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:06:54.897890 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dea5109aa917ea20a0a7973cacdcdebfe039da66ee2468055b80770188c8ba9\": container with ID starting with 3dea5109aa917ea20a0a7973cacdcdebfe039da66ee2468055b80770188c8ba9 not found: ID does not exist" containerID="3dea5109aa917ea20a0a7973cacdcdebfe039da66ee2468055b80770188c8ba9" Apr 24 22:06:54.897952 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.897921 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dea5109aa917ea20a0a7973cacdcdebfe039da66ee2468055b80770188c8ba9"} err="failed to get container status \"3dea5109aa917ea20a0a7973cacdcdebfe039da66ee2468055b80770188c8ba9\": rpc error: code = NotFound desc = could not find container \"3dea5109aa917ea20a0a7973cacdcdebfe039da66ee2468055b80770188c8ba9\": container with ID starting 
with 3dea5109aa917ea20a0a7973cacdcdebfe039da66ee2468055b80770188c8ba9 not found: ID does not exist" Apr 24 22:06:54.897952 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.897946 2570 scope.go:117] "RemoveContainer" containerID="00d6fd5b9b5c62498fd1c6cfb5a6774edec80034eaa6ca97f80127d23a8ddd5d" Apr 24 22:06:54.898237 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:06:54.898213 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00d6fd5b9b5c62498fd1c6cfb5a6774edec80034eaa6ca97f80127d23a8ddd5d\": container with ID starting with 00d6fd5b9b5c62498fd1c6cfb5a6774edec80034eaa6ca97f80127d23a8ddd5d not found: ID does not exist" containerID="00d6fd5b9b5c62498fd1c6cfb5a6774edec80034eaa6ca97f80127d23a8ddd5d" Apr 24 22:06:54.898342 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.898242 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00d6fd5b9b5c62498fd1c6cfb5a6774edec80034eaa6ca97f80127d23a8ddd5d"} err="failed to get container status \"00d6fd5b9b5c62498fd1c6cfb5a6774edec80034eaa6ca97f80127d23a8ddd5d\": rpc error: code = NotFound desc = could not find container \"00d6fd5b9b5c62498fd1c6cfb5a6774edec80034eaa6ca97f80127d23a8ddd5d\": container with ID starting with 00d6fd5b9b5c62498fd1c6cfb5a6774edec80034eaa6ca97f80127d23a8ddd5d not found: ID does not exist" Apr 24 22:06:54.903752 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.903733 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql"] Apr 24 22:06:54.907809 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:54.907788 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-bxhql"] Apr 24 22:06:55.879573 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:55.879545 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" event={"ID":"5abd18b1-6567-498c-b4d1-444e34d775a5","Type":"ContainerStarted","Data":"3dbf525410d214c6fdc30b136716582d8e6b07b798198f3c82c4ef7bf4ae177a"} Apr 24 22:06:55.879933 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:55.879580 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" event={"ID":"5abd18b1-6567-498c-b4d1-444e34d775a5","Type":"ContainerStarted","Data":"ec2b189cb0f587540b2f7b3fb508d81e10e3ef3424fcc352b32263a38ffaf1ec"} Apr 24 22:06:55.879933 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:55.879807 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" Apr 24 22:06:55.879933 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:55.879849 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" Apr 24 22:06:55.901214 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:55.901158 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" podStartSLOduration=5.90114499 podStartE2EDuration="5.90114499s" podCreationTimestamp="2026-04-24 22:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:06:55.899290563 
+0000 UTC m=+2368.322797968" watchObservedRunningTime="2026-04-24 22:06:55.90114499 +0000 UTC m=+2368.324652395" Apr 24 22:06:56.066758 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:06:56.066721 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="105fc865-51ba-4add-81a7-bf81e72bd348" path="/var/lib/kubelet/pods/105fc865-51ba-4add-81a7-bf81e72bd348/volumes" Apr 24 22:07:01.888087 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:07:01.888061 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" Apr 24 22:07:28.121011 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:07:28.120978 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 22:07:28.129353 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:07:28.129330 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 22:07:31.889399 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:07:31.889313 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" podUID="5abd18b1-6567-498c-b4d1-444e34d775a5" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.45:8080: connect: connection refused" Apr 24 22:07:41.889116 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:07:41.889017 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" podUID="5abd18b1-6567-498c-b4d1-444e34d775a5" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.45:8080: connect: connection refused" Apr 24 22:07:51.888998 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:07:51.888958 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" podUID="5abd18b1-6567-498c-b4d1-444e34d775a5" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.45:8080: connect: connection refused" Apr 24 22:08:01.889284 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:01.889245 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" podUID="5abd18b1-6567-498c-b4d1-444e34d775a5" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.45:8080: connect: connection refused" Apr 24 22:08:11.891872 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:11.891841 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" Apr 24 22:08:20.436938 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:20.436900 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw"] Apr 24 22:08:20.437422 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:20.437264 2570 kuberuntime_container.go:864] "Killing container 
with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" podUID="5abd18b1-6567-498c-b4d1-444e34d775a5" containerName="kserve-container" containerID="cri-o://ec2b189cb0f587540b2f7b3fb508d81e10e3ef3424fcc352b32263a38ffaf1ec" gracePeriod=30 Apr 24 22:08:20.437422 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:20.437308 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" podUID="5abd18b1-6567-498c-b4d1-444e34d775a5" containerName="kube-rbac-proxy" containerID="cri-o://3dbf525410d214c6fdc30b136716582d8e6b07b798198f3c82c4ef7bf4ae177a" gracePeriod=30 Apr 24 22:08:21.112987 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:21.112957 2570 generic.go:358] "Generic (PLEG): container finished" podID="5abd18b1-6567-498c-b4d1-444e34d775a5" containerID="3dbf525410d214c6fdc30b136716582d8e6b07b798198f3c82c4ef7bf4ae177a" exitCode=2 Apr 24 22:08:21.113193 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:21.113006 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" event={"ID":"5abd18b1-6567-498c-b4d1-444e34d775a5","Type":"ContainerDied","Data":"3dbf525410d214c6fdc30b136716582d8e6b07b798198f3c82c4ef7bf4ae177a"} Apr 24 22:08:21.882908 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:21.882863 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" podUID="5abd18b1-6567-498c-b4d1-444e34d775a5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.45:8643/healthz\": dial tcp 10.133.0.45:8643: connect: connection refused" Apr 24 22:08:21.889278 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:21.889249 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" podUID="5abd18b1-6567-498c-b4d1-444e34d775a5" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.45:8080: connect: connection refused" Apr 24 22:08:22.607961 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.607931 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk"] Apr 24 22:08:22.608220 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.608208 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="105fc865-51ba-4add-81a7-bf81e72bd348" containerName="kserve-container" Apr 24 22:08:22.608220 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.608221 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="105fc865-51ba-4add-81a7-bf81e72bd348" containerName="kserve-container" Apr 24 22:08:22.608327 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.608231 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="105fc865-51ba-4add-81a7-bf81e72bd348" containerName="kube-rbac-proxy" Apr 24 22:08:22.608327 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.608237 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="105fc865-51ba-4add-81a7-bf81e72bd348" containerName="kube-rbac-proxy" Apr 24 22:08:22.608327 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.608254 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="105fc865-51ba-4add-81a7-bf81e72bd348" 
containerName="storage-initializer" Apr 24 22:08:22.608327 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.608261 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="105fc865-51ba-4add-81a7-bf81e72bd348" containerName="storage-initializer" Apr 24 22:08:22.608327 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.608306 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="105fc865-51ba-4add-81a7-bf81e72bd348" containerName="kube-rbac-proxy" Apr 24 22:08:22.608327 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.608313 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="105fc865-51ba-4add-81a7-bf81e72bd348" containerName="kserve-container" Apr 24 22:08:22.611188 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.611171 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" Apr 24 22:08:22.613345 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.613326 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-kube-rbac-proxy-sar-config\"" Apr 24 22:08:22.613436 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.613326 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-predictor-serving-cert\"" Apr 24 22:08:22.622597 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.622572 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk"] Apr 24 22:08:22.698804 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.698763 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-kserve-provision-location\") pod \"isvc-sklearn-predictor-9d8cff754-hq5lk\" (UID: \"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" Apr 24 22:08:22.698948 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.698809 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9h44\" (UniqueName: \"kubernetes.io/projected/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-kube-api-access-t9h44\") pod \"isvc-sklearn-predictor-9d8cff754-hq5lk\" (UID: \"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" Apr 24 22:08:22.698948 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.698886 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-9d8cff754-hq5lk\" (UID: \"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" Apr 24 22:08:22.699058 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.698957 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-proxy-tls\") pod \"isvc-sklearn-predictor-9d8cff754-hq5lk\" (UID: \"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" Apr 24 22:08:22.799721 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.799688 
2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-kserve-provision-location\") pod \"isvc-sklearn-predictor-9d8cff754-hq5lk\" (UID: \"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" Apr 24 22:08:22.799721 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.799723 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9h44\" (UniqueName: \"kubernetes.io/projected/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-kube-api-access-t9h44\") pod \"isvc-sklearn-predictor-9d8cff754-hq5lk\" (UID: \"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" Apr 24 22:08:22.799949 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.799759 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-9d8cff754-hq5lk\" (UID: \"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" Apr 24 22:08:22.799949 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.799802 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-proxy-tls\") pod \"isvc-sklearn-predictor-9d8cff754-hq5lk\" (UID: \"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" Apr 24 22:08:22.800160 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.800143 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-kserve-provision-location\") pod \"isvc-sklearn-predictor-9d8cff754-hq5lk\" (UID: \"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" Apr 24 22:08:22.800490 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.800457 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-9d8cff754-hq5lk\" (UID: \"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" Apr 24 22:08:22.802302 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.802284 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-proxy-tls\") pod \"isvc-sklearn-predictor-9d8cff754-hq5lk\" (UID: \"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" Apr 24 22:08:22.808132 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.808096 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9h44\" (UniqueName: \"kubernetes.io/projected/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-kube-api-access-t9h44\") pod \"isvc-sklearn-predictor-9d8cff754-hq5lk\" (UID: \"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" Apr 24 22:08:22.921797 
ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:22.921763 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" Apr 24 22:08:23.048390 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:23.048365 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk"] Apr 24 22:08:23.050983 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:08:23.050956 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13ea4aca_2de6_4eb0_bb0d_dc2f5f09dce0.slice/crio-73a16430b92fcd1607534eebd8f9d15da208dc805d7a63a33006012cf5e203f0 WatchSource:0}: Error finding container 73a16430b92fcd1607534eebd8f9d15da208dc805d7a63a33006012cf5e203f0: Status 404 returned error can't find the container with id 73a16430b92fcd1607534eebd8f9d15da208dc805d7a63a33006012cf5e203f0 Apr 24 22:08:23.119889 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:23.119847 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" event={"ID":"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0","Type":"ContainerStarted","Data":"b030109582b0926423ea9400186b9817fa8287dcc3e3b0eba0a938baccd9c3d9"} Apr 24 22:08:23.119998 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:23.119901 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" event={"ID":"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0","Type":"ContainerStarted","Data":"73a16430b92fcd1607534eebd8f9d15da208dc805d7a63a33006012cf5e203f0"} Apr 24 22:08:26.131260 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:26.131227 2570 generic.go:358] "Generic (PLEG): container finished" podID="5abd18b1-6567-498c-b4d1-444e34d775a5" containerID="ec2b189cb0f587540b2f7b3fb508d81e10e3ef3424fcc352b32263a38ffaf1ec" exitCode=0 Apr 24 22:08:26.131627 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:26.131284 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" event={"ID":"5abd18b1-6567-498c-b4d1-444e34d775a5","Type":"ContainerDied","Data":"ec2b189cb0f587540b2f7b3fb508d81e10e3ef3424fcc352b32263a38ffaf1ec"} Apr 24 22:08:26.267437 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:26.267412 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" Apr 24 22:08:26.322659 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:26.322629 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5abd18b1-6567-498c-b4d1-444e34d775a5-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"5abd18b1-6567-498c-b4d1-444e34d775a5\" (UID: \"5abd18b1-6567-498c-b4d1-444e34d775a5\") " Apr 24 22:08:26.322834 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:26.322670 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5abd18b1-6567-498c-b4d1-444e34d775a5-proxy-tls\") pod \"5abd18b1-6567-498c-b4d1-444e34d775a5\" (UID: \"5abd18b1-6567-498c-b4d1-444e34d775a5\") " Apr 24 22:08:26.322834 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:26.322703 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5abd18b1-6567-498c-b4d1-444e34d775a5-kserve-provision-location\") pod \"5abd18b1-6567-498c-b4d1-444e34d775a5\" (UID: \"5abd18b1-6567-498c-b4d1-444e34d775a5\") " Apr 24 22:08:26.322834 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:26.322725 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hbk6\" (UniqueName: \"kubernetes.io/projected/5abd18b1-6567-498c-b4d1-444e34d775a5-kube-api-access-4hbk6\") pod \"5abd18b1-6567-498c-b4d1-444e34d775a5\" (UID: \"5abd18b1-6567-498c-b4d1-444e34d775a5\") " Apr 24 22:08:26.323101 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:26.323076 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5abd18b1-6567-498c-b4d1-444e34d775a5-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config") pod "5abd18b1-6567-498c-b4d1-444e34d775a5" (UID: "5abd18b1-6567-498c-b4d1-444e34d775a5"). InnerVolumeSpecName "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:08:26.323172 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:26.323098 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5abd18b1-6567-498c-b4d1-444e34d775a5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5abd18b1-6567-498c-b4d1-444e34d775a5" (UID: "5abd18b1-6567-498c-b4d1-444e34d775a5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:08:26.324937 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:26.324918 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5abd18b1-6567-498c-b4d1-444e34d775a5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5abd18b1-6567-498c-b4d1-444e34d775a5" (UID: "5abd18b1-6567-498c-b4d1-444e34d775a5"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:08:26.325051 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:26.325002 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5abd18b1-6567-498c-b4d1-444e34d775a5-kube-api-access-4hbk6" (OuterVolumeSpecName: "kube-api-access-4hbk6") pod "5abd18b1-6567-498c-b4d1-444e34d775a5" (UID: "5abd18b1-6567-498c-b4d1-444e34d775a5"). InnerVolumeSpecName "kube-api-access-4hbk6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:08:26.423206 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:26.423175 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5abd18b1-6567-498c-b4d1-444e34d775a5-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:08:26.423206 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:26.423205 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5abd18b1-6567-498c-b4d1-444e34d775a5-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:08:26.423406 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:26.423217 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5abd18b1-6567-498c-b4d1-444e34d775a5-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:08:26.423406 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:26.423229 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4hbk6\" (UniqueName: \"kubernetes.io/projected/5abd18b1-6567-498c-b4d1-444e34d775a5-kube-api-access-4hbk6\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:08:27.136058 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:27.135931 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" event={"ID":"5abd18b1-6567-498c-b4d1-444e34d775a5","Type":"ContainerDied","Data":"ab137b3ac7180839adbc78045cffb8d35947d42681d2610952fd79ebfebea2ec"} Apr 24 22:08:27.136058 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:27.135970 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw" Apr 24 22:08:27.136058 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:27.135979 2570 scope.go:117] "RemoveContainer" containerID="3dbf525410d214c6fdc30b136716582d8e6b07b798198f3c82c4ef7bf4ae177a" Apr 24 22:08:27.137450 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:27.137427 2570 generic.go:358] "Generic (PLEG): container finished" podID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" containerID="b030109582b0926423ea9400186b9817fa8287dcc3e3b0eba0a938baccd9c3d9" exitCode=0 Apr 24 22:08:27.137688 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:27.137502 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" event={"ID":"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0","Type":"ContainerDied","Data":"b030109582b0926423ea9400186b9817fa8287dcc3e3b0eba0a938baccd9c3d9"} Apr 24 22:08:27.145170 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:27.145145 2570 scope.go:117] "RemoveContainer" containerID="ec2b189cb0f587540b2f7b3fb508d81e10e3ef3424fcc352b32263a38ffaf1ec" Apr 24 22:08:27.152408 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:27.152154 2570 scope.go:117] "RemoveContainer" containerID="aa47481d368d53b9ef3d4868a45e9e190e519fc4fe126b4f2a71d2f68dadbd81" Apr 24 22:08:27.175050 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:27.175003 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw"] Apr 24 22:08:27.180109 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:27.180085 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-4spkw"] Apr 24 22:08:28.066668 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:28.066638 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5abd18b1-6567-498c-b4d1-444e34d775a5" path="/var/lib/kubelet/pods/5abd18b1-6567-498c-b4d1-444e34d775a5/volumes" Apr 24 22:08:28.143086 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:28.143053 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" event={"ID":"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0","Type":"ContainerStarted","Data":"385846bc82de9120cc208311e5b85a96b8b9685bcf46ff8d3b3535e3cb05fa81"} Apr 24 22:08:28.143086 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:28.143088 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" event={"ID":"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0","Type":"ContainerStarted","Data":"053cd265c2587e074595e6d3f91f3318fa1310a5ca11556dee3bc97da9ae5019"} Apr 24 22:08:28.143513 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:28.143401 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" Apr 24 22:08:28.143554 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:28.143543 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" Apr 24 22:08:28.144846 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:28.144820 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" podUID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 
24 22:08:28.165682 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:28.165639 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" podStartSLOduration=6.165624329 podStartE2EDuration="6.165624329s" podCreationTimestamp="2026-04-24 22:08:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:08:28.165300894 +0000 UTC m=+2460.588808301" watchObservedRunningTime="2026-04-24 22:08:28.165624329 +0000 UTC m=+2460.589131737" Apr 24 22:08:29.146269 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:29.146230 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" podUID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 24 22:08:34.150371 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:34.150340 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" Apr 24 22:08:34.151068 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:34.151017 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" podUID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 24 22:08:44.151732 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:44.151691 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" podUID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 24 22:08:54.151043 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:08:54.150978 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" podUID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 24 22:09:04.151489 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:04.151448 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" podUID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 24 22:09:14.151038 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:14.150939 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" podUID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 24 22:09:24.151432 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:24.151393 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" podUID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 24 22:09:34.151771 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:34.151740 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" Apr 24 22:09:42.696877 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.696842 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk"] Apr 24 22:09:42.697376 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.697218 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" podUID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" containerName="kserve-container" containerID="cri-o://053cd265c2587e074595e6d3f91f3318fa1310a5ca11556dee3bc97da9ae5019" gracePeriod=30 Apr 24 22:09:42.697376 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.697252 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" podUID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" containerName="kube-rbac-proxy" containerID="cri-o://385846bc82de9120cc208311e5b85a96b8b9685bcf46ff8d3b3535e3cb05fa81" gracePeriod=30 Apr 24 22:09:42.776667 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.776636 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597"] Apr 24 22:09:42.776906 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.776895 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5abd18b1-6567-498c-b4d1-444e34d775a5" containerName="kube-rbac-proxy" Apr 24 22:09:42.776951 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.776908 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="5abd18b1-6567-498c-b4d1-444e34d775a5" containerName="kube-rbac-proxy" Apr 24 22:09:42.776951 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.776920 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5abd18b1-6567-498c-b4d1-444e34d775a5" containerName="storage-initializer" Apr 24 22:09:42.776951 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.776925 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="5abd18b1-6567-498c-b4d1-444e34d775a5" containerName="storage-initializer" Apr 24 22:09:42.776951 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.776934 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5abd18b1-6567-498c-b4d1-444e34d775a5" containerName="kserve-container" Apr 24 22:09:42.776951 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.776939 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="5abd18b1-6567-498c-b4d1-444e34d775a5" containerName="kserve-container" Apr 24 22:09:42.777135 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.777015 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="5abd18b1-6567-498c-b4d1-444e34d775a5" containerName="kserve-container" Apr 24 22:09:42.777135 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.777046 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="5abd18b1-6567-498c-b4d1-444e34d775a5" containerName="kube-rbac-proxy" Apr 24 22:09:42.780123 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.780103 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" Apr 24 22:09:42.782352 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.782330 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 24 22:09:42.782435 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.782373 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-predictor-serving-cert\"" Apr 24 22:09:42.789393 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.789372 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597"] Apr 24 22:09:42.851607 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.851580 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a86be664-9220-4f49-9bec-959f2cc40c74-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-bl597\" (UID: \"a86be664-9220-4f49-9bec-959f2cc40c74\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" Apr 24 22:09:42.851752 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.851621 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a86be664-9220-4f49-9bec-959f2cc40c74-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-bl597\" (UID: \"a86be664-9220-4f49-9bec-959f2cc40c74\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" Apr 24 22:09:42.851752 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.851659 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a86be664-9220-4f49-9bec-959f2cc40c74-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-bl597\" (UID: \"a86be664-9220-4f49-9bec-959f2cc40c74\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" Apr 24 22:09:42.851752 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.851699 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn58m\" (UniqueName: \"kubernetes.io/projected/a86be664-9220-4f49-9bec-959f2cc40c74-kube-api-access-tn58m\") pod \"sklearn-v2-mlserver-predictor-65d8664766-bl597\" (UID: \"a86be664-9220-4f49-9bec-959f2cc40c74\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" Apr 24 22:09:42.952477 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.952401 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a86be664-9220-4f49-9bec-959f2cc40c74-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-bl597\" (UID: \"a86be664-9220-4f49-9bec-959f2cc40c74\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" Apr 24 22:09:42.952477 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.952448 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a86be664-9220-4f49-9bec-959f2cc40c74-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-bl597\" (UID: 
\"a86be664-9220-4f49-9bec-959f2cc40c74\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" Apr 24 22:09:42.952662 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.952569 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a86be664-9220-4f49-9bec-959f2cc40c74-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-bl597\" (UID: \"a86be664-9220-4f49-9bec-959f2cc40c74\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" Apr 24 22:09:42.952662 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.952604 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tn58m\" (UniqueName: \"kubernetes.io/projected/a86be664-9220-4f49-9bec-959f2cc40c74-kube-api-access-tn58m\") pod \"sklearn-v2-mlserver-predictor-65d8664766-bl597\" (UID: \"a86be664-9220-4f49-9bec-959f2cc40c74\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" Apr 24 22:09:42.952863 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.952841 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a86be664-9220-4f49-9bec-959f2cc40c74-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-bl597\" (UID: \"a86be664-9220-4f49-9bec-959f2cc40c74\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" Apr 24 22:09:42.953214 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.953197 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a86be664-9220-4f49-9bec-959f2cc40c74-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-bl597\" (UID: \"a86be664-9220-4f49-9bec-959f2cc40c74\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" Apr 24 22:09:42.954951 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.954935 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a86be664-9220-4f49-9bec-959f2cc40c74-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-bl597\" (UID: \"a86be664-9220-4f49-9bec-959f2cc40c74\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" Apr 24 22:09:42.960912 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:42.960890 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn58m\" (UniqueName: \"kubernetes.io/projected/a86be664-9220-4f49-9bec-959f2cc40c74-kube-api-access-tn58m\") pod \"sklearn-v2-mlserver-predictor-65d8664766-bl597\" (UID: \"a86be664-9220-4f49-9bec-959f2cc40c74\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" Apr 24 22:09:43.090794 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:43.090757 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" Apr 24 22:09:43.210854 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:43.210806 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597"] Apr 24 22:09:43.213354 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:09:43.213324 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda86be664_9220_4f49_9bec_959f2cc40c74.slice/crio-f458a72a2d61b9242235c159fbc3e5203c104200e6f41d58c64debcc343c7f37 WatchSource:0}: Error finding container f458a72a2d61b9242235c159fbc3e5203c104200e6f41d58c64debcc343c7f37: Status 404 returned error can't find the container with id f458a72a2d61b9242235c159fbc3e5203c104200e6f41d58c64debcc343c7f37 Apr 24 22:09:43.215004 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:43.214990 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:09:43.356529 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:43.356487 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" event={"ID":"a86be664-9220-4f49-9bec-959f2cc40c74","Type":"ContainerStarted","Data":"74ee2dc6acd5214911abecc8136f385a8ff5c48a939f144b0de0b089a72d53a2"} Apr 24 22:09:43.356529 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:43.356530 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" event={"ID":"a86be664-9220-4f49-9bec-959f2cc40c74","Type":"ContainerStarted","Data":"f458a72a2d61b9242235c159fbc3e5203c104200e6f41d58c64debcc343c7f37"} Apr 24 22:09:43.358350 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:43.358326 2570 generic.go:358] "Generic (PLEG): container finished" podID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" containerID="385846bc82de9120cc208311e5b85a96b8b9685bcf46ff8d3b3535e3cb05fa81" exitCode=2 Apr 24 22:09:43.358463 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:43.358365 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" event={"ID":"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0","Type":"ContainerDied","Data":"385846bc82de9120cc208311e5b85a96b8b9685bcf46ff8d3b3535e3cb05fa81"} Apr 24 22:09:44.147394 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:44.147352 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" podUID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.46:8643/healthz\": dial tcp 10.133.0.46:8643: connect: connection refused" Apr 24 22:09:44.151706 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:44.151683 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" podUID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 24 22:09:46.736551 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:46.736528 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" Apr 24 22:09:46.883079 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:46.882982 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-kserve-provision-location\") pod \"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0\" (UID: \"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0\") " Apr 24 22:09:46.883079 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:46.883046 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0\" (UID: \"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0\") " Apr 24 22:09:46.883313 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:46.883098 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9h44\" (UniqueName: \"kubernetes.io/projected/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-kube-api-access-t9h44\") pod \"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0\" (UID: \"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0\") " Apr 24 22:09:46.883313 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:46.883158 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-proxy-tls\") pod \"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0\" (UID: \"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0\") " Apr 24 22:09:46.883426 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:46.883344 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" (UID: "13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:09:46.883486 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:46.883455 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-isvc-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-kube-rbac-proxy-sar-config") pod "13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" (UID: "13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0"). InnerVolumeSpecName "isvc-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:09:46.885554 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:46.885524 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-kube-api-access-t9h44" (OuterVolumeSpecName: "kube-api-access-t9h44") pod "13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" (UID: "13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0"). InnerVolumeSpecName "kube-api-access-t9h44". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:09:46.885673 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:46.885587 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" (UID: "13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:09:46.984322 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:46.984281 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:09:46.984322 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:46.984313 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:09:46.984322 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:46.984326 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-isvc-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:09:46.984556 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:46.984337 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t9h44\" (UniqueName: \"kubernetes.io/projected/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0-kube-api-access-t9h44\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:09:47.370890 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:47.370861 2570 generic.go:358] "Generic (PLEG): container finished" podID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" containerID="053cd265c2587e074595e6d3f91f3318fa1310a5ca11556dee3bc97da9ae5019" exitCode=0 Apr 24 22:09:47.371080 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:47.370938 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" event={"ID":"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0","Type":"ContainerDied","Data":"053cd265c2587e074595e6d3f91f3318fa1310a5ca11556dee3bc97da9ae5019"} Apr 24 22:09:47.371080 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:47.370946 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" Apr 24 22:09:47.371080 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:47.370976 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk" event={"ID":"13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0","Type":"ContainerDied","Data":"73a16430b92fcd1607534eebd8f9d15da208dc805d7a63a33006012cf5e203f0"} Apr 24 22:09:47.371080 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:47.370993 2570 scope.go:117] "RemoveContainer" containerID="385846bc82de9120cc208311e5b85a96b8b9685bcf46ff8d3b3535e3cb05fa81" Apr 24 22:09:47.372417 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:47.372395 2570 generic.go:358] "Generic (PLEG): container finished" podID="a86be664-9220-4f49-9bec-959f2cc40c74" containerID="74ee2dc6acd5214911abecc8136f385a8ff5c48a939f144b0de0b089a72d53a2" exitCode=0 Apr 24 22:09:47.372522 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:47.372420 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" event={"ID":"a86be664-9220-4f49-9bec-959f2cc40c74","Type":"ContainerDied","Data":"74ee2dc6acd5214911abecc8136f385a8ff5c48a939f144b0de0b089a72d53a2"} Apr 24 22:09:47.379580 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:47.379561 2570 scope.go:117] "RemoveContainer" containerID="053cd265c2587e074595e6d3f91f3318fa1310a5ca11556dee3bc97da9ae5019" Apr 24 22:09:47.386632 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:47.386616 2570 scope.go:117] "RemoveContainer" containerID="b030109582b0926423ea9400186b9817fa8287dcc3e3b0eba0a938baccd9c3d9" Apr 24 22:09:47.393446 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:47.393429 2570 scope.go:117] "RemoveContainer" containerID="385846bc82de9120cc208311e5b85a96b8b9685bcf46ff8d3b3535e3cb05fa81" Apr 24 22:09:47.393700 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:09:47.393678 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"385846bc82de9120cc208311e5b85a96b8b9685bcf46ff8d3b3535e3cb05fa81\": container with ID starting with 385846bc82de9120cc208311e5b85a96b8b9685bcf46ff8d3b3535e3cb05fa81 not found: ID does not exist" containerID="385846bc82de9120cc208311e5b85a96b8b9685bcf46ff8d3b3535e3cb05fa81" Apr 24 22:09:47.393785 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:47.393705 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"385846bc82de9120cc208311e5b85a96b8b9685bcf46ff8d3b3535e3cb05fa81"} err="failed to get container status \"385846bc82de9120cc208311e5b85a96b8b9685bcf46ff8d3b3535e3cb05fa81\": rpc error: code = NotFound desc = could not find container \"385846bc82de9120cc208311e5b85a96b8b9685bcf46ff8d3b3535e3cb05fa81\": container with ID starting with 385846bc82de9120cc208311e5b85a96b8b9685bcf46ff8d3b3535e3cb05fa81 not found: ID does not exist" Apr 24 22:09:47.393785 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:47.393721 2570 scope.go:117] "RemoveContainer" containerID="053cd265c2587e074595e6d3f91f3318fa1310a5ca11556dee3bc97da9ae5019" Apr 24 22:09:47.394105 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:09:47.394086 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"053cd265c2587e074595e6d3f91f3318fa1310a5ca11556dee3bc97da9ae5019\": container with ID starting with 053cd265c2587e074595e6d3f91f3318fa1310a5ca11556dee3bc97da9ae5019 not found: ID does 
not exist" containerID="053cd265c2587e074595e6d3f91f3318fa1310a5ca11556dee3bc97da9ae5019" Apr 24 22:09:47.394198 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:47.394114 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"053cd265c2587e074595e6d3f91f3318fa1310a5ca11556dee3bc97da9ae5019"} err="failed to get container status \"053cd265c2587e074595e6d3f91f3318fa1310a5ca11556dee3bc97da9ae5019\": rpc error: code = NotFound desc = could not find container \"053cd265c2587e074595e6d3f91f3318fa1310a5ca11556dee3bc97da9ae5019\": container with ID starting with 053cd265c2587e074595e6d3f91f3318fa1310a5ca11556dee3bc97da9ae5019 not found: ID does not exist" Apr 24 22:09:47.394198 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:47.394130 2570 scope.go:117] "RemoveContainer" containerID="b030109582b0926423ea9400186b9817fa8287dcc3e3b0eba0a938baccd9c3d9" Apr 24 22:09:47.394869 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:09:47.394841 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b030109582b0926423ea9400186b9817fa8287dcc3e3b0eba0a938baccd9c3d9\": container with ID starting with b030109582b0926423ea9400186b9817fa8287dcc3e3b0eba0a938baccd9c3d9 not found: ID does not exist" containerID="b030109582b0926423ea9400186b9817fa8287dcc3e3b0eba0a938baccd9c3d9" Apr 24 22:09:47.395002 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:47.394875 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b030109582b0926423ea9400186b9817fa8287dcc3e3b0eba0a938baccd9c3d9"} err="failed to get container status \"b030109582b0926423ea9400186b9817fa8287dcc3e3b0eba0a938baccd9c3d9\": rpc error: code = NotFound desc = could not find container \"b030109582b0926423ea9400186b9817fa8287dcc3e3b0eba0a938baccd9c3d9\": container with ID starting with b030109582b0926423ea9400186b9817fa8287dcc3e3b0eba0a938baccd9c3d9 not found: ID does not exist" Apr 24 22:09:47.405994 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:47.405974 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk"] Apr 24 22:09:47.411974 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:47.411954 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-9d8cff754-hq5lk"] Apr 24 22:09:48.066945 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:48.066912 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" path="/var/lib/kubelet/pods/13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0/volumes" Apr 24 22:09:48.377370 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:48.377286 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" event={"ID":"a86be664-9220-4f49-9bec-959f2cc40c74","Type":"ContainerStarted","Data":"994c3c47684607287801d6f668cc2eb0b49f31103f44594fb13f5a7af47489ce"} Apr 24 22:09:48.377370 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:48.377327 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" event={"ID":"a86be664-9220-4f49-9bec-959f2cc40c74","Type":"ContainerStarted","Data":"93532160f8c58faf5126f142ea9ecf01e6d934fa08bf326e07c00df7cb532994"} Apr 24 22:09:48.377587 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:48.377548 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" Apr 24 22:09:48.397767 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:48.397728 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" podStartSLOduration=6.397715658 podStartE2EDuration="6.397715658s" podCreationTimestamp="2026-04-24 22:09:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:09:48.395643394 +0000 UTC m=+2540.819150801" watchObservedRunningTime="2026-04-24 22:09:48.397715658 +0000 UTC m=+2540.821223111" Apr 24 22:09:49.379629 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:49.379600 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" Apr 24 22:09:55.388685 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:09:55.388658 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" Apr 24 22:10:25.418523 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:25.418471 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" podUID="a86be664-9220-4f49-9bec-959f2cc40c74" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 24 22:10:35.391662 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:35.391625 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" Apr 24 22:10:42.901566 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:42.901480 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597"] Apr 24 22:10:42.901951 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:42.901819 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" podUID="a86be664-9220-4f49-9bec-959f2cc40c74" containerName="kserve-container" containerID="cri-o://93532160f8c58faf5126f142ea9ecf01e6d934fa08bf326e07c00df7cb532994" gracePeriod=30 Apr 24 22:10:42.901951 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:42.901878 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" podUID="a86be664-9220-4f49-9bec-959f2cc40c74" containerName="kube-rbac-proxy" containerID="cri-o://994c3c47684607287801d6f668cc2eb0b49f31103f44594fb13f5a7af47489ce" gracePeriod=30 Apr 24 22:10:42.972892 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:42.972862 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78"] Apr 24 22:10:42.973160 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:42.973147 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" containerName="kserve-container" Apr 24 22:10:42.973236 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:42.973163 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" containerName="kserve-container" Apr 24 22:10:42.973236 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:42.973172 2570 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" containerName="kube-rbac-proxy" Apr 24 22:10:42.973236 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:42.973188 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" containerName="kube-rbac-proxy" Apr 24 22:10:42.973236 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:42.973206 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" containerName="storage-initializer" Apr 24 22:10:42.973236 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:42.973211 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" containerName="storage-initializer" Apr 24 22:10:42.973411 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:42.973259 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" containerName="kube-rbac-proxy" Apr 24 22:10:42.973411 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:42.973268 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="13ea4aca-2de6-4eb0-bb0d-dc2f5f09dce0" containerName="kserve-container" Apr 24 22:10:42.976185 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:42.976169 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" Apr 24 22:10:42.979800 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:42.979779 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-predictor-serving-cert\"" Apr 24 22:10:42.980129 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:42.980111 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\"" Apr 24 22:10:42.990617 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:42.990595 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/389855e7-b863-47d4-86c2-049c8df0131a-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78\" (UID: \"389855e7-b863-47d4-86c2-049c8df0131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" Apr 24 22:10:42.990706 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:42.990639 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8r6g\" (UniqueName: \"kubernetes.io/projected/389855e7-b863-47d4-86c2-049c8df0131a-kube-api-access-g8r6g\") pod \"isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78\" (UID: \"389855e7-b863-47d4-86c2-049c8df0131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" Apr 24 22:10:42.990706 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:42.990664 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/389855e7-b863-47d4-86c2-049c8df0131a-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78\" (UID: \"389855e7-b863-47d4-86c2-049c8df0131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" Apr 24 22:10:42.990706 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:42.990686 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/389855e7-b863-47d4-86c2-049c8df0131a-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78\" (UID: \"389855e7-b863-47d4-86c2-049c8df0131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" Apr 24 22:10:42.993556 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:42.993534 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78"] Apr 24 22:10:43.091148 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:43.091110 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8r6g\" (UniqueName: \"kubernetes.io/projected/389855e7-b863-47d4-86c2-049c8df0131a-kube-api-access-g8r6g\") pod \"isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78\" (UID: \"389855e7-b863-47d4-86c2-049c8df0131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" Apr 24 22:10:43.091334 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:43.091164 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/389855e7-b863-47d4-86c2-049c8df0131a-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78\" (UID: \"389855e7-b863-47d4-86c2-049c8df0131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" Apr 24 22:10:43.091334 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:43.091211 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/389855e7-b863-47d4-86c2-049c8df0131a-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78\" (UID: \"389855e7-b863-47d4-86c2-049c8df0131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" Apr 24 22:10:43.091334 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:43.091255 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/389855e7-b863-47d4-86c2-049c8df0131a-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78\" (UID: \"389855e7-b863-47d4-86c2-049c8df0131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" Apr 24 22:10:43.091334 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:10:43.091287 2570 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-serving-cert: secret "isvc-sklearn-runtime-predictor-serving-cert" not found Apr 24 22:10:43.091607 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:10:43.091367 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/389855e7-b863-47d4-86c2-049c8df0131a-proxy-tls podName:389855e7-b863-47d4-86c2-049c8df0131a nodeName:}" failed. No retries permitted until 2026-04-24 22:10:43.591345923 +0000 UTC m=+2596.014853309 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/389855e7-b863-47d4-86c2-049c8df0131a-proxy-tls") pod "isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" (UID: "389855e7-b863-47d4-86c2-049c8df0131a") : secret "isvc-sklearn-runtime-predictor-serving-cert" not found Apr 24 22:10:43.091607 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:43.091515 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/389855e7-b863-47d4-86c2-049c8df0131a-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78\" (UID: \"389855e7-b863-47d4-86c2-049c8df0131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" Apr 24 22:10:43.091896 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:43.091873 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/389855e7-b863-47d4-86c2-049c8df0131a-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78\" (UID: \"389855e7-b863-47d4-86c2-049c8df0131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" Apr 24 22:10:43.102086 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:43.102060 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8r6g\" (UniqueName: \"kubernetes.io/projected/389855e7-b863-47d4-86c2-049c8df0131a-kube-api-access-g8r6g\") pod \"isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78\" (UID: \"389855e7-b863-47d4-86c2-049c8df0131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" Apr 24 22:10:43.533665 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:43.533631 2570 generic.go:358] "Generic (PLEG): container finished" podID="a86be664-9220-4f49-9bec-959f2cc40c74" containerID="994c3c47684607287801d6f668cc2eb0b49f31103f44594fb13f5a7af47489ce" exitCode=2 Apr 24 22:10:43.533833 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:43.533672 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" event={"ID":"a86be664-9220-4f49-9bec-959f2cc40c74","Type":"ContainerDied","Data":"994c3c47684607287801d6f668cc2eb0b49f31103f44594fb13f5a7af47489ce"} Apr 24 22:10:43.595041 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:43.594992 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/389855e7-b863-47d4-86c2-049c8df0131a-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78\" (UID: \"389855e7-b863-47d4-86c2-049c8df0131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" Apr 24 22:10:43.597460 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:43.597441 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/389855e7-b863-47d4-86c2-049c8df0131a-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78\" (UID: \"389855e7-b863-47d4-86c2-049c8df0131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" Apr 24 22:10:43.885596 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:43.885517 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" Apr 24 22:10:44.011561 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:44.011530 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78"] Apr 24 22:10:44.014547 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:10:44.014519 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod389855e7_b863_47d4_86c2_049c8df0131a.slice/crio-8bd7217505fa61966d2f0a071b6c840b7a99b4a1ab87ef14ec4913a037454586 WatchSource:0}: Error finding container 8bd7217505fa61966d2f0a071b6c840b7a99b4a1ab87ef14ec4913a037454586: Status 404 returned error can't find the container with id 8bd7217505fa61966d2f0a071b6c840b7a99b4a1ab87ef14ec4913a037454586 Apr 24 22:10:44.537734 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:44.537647 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" event={"ID":"389855e7-b863-47d4-86c2-049c8df0131a","Type":"ContainerStarted","Data":"62db5db7c4e504eb3d8086324c9d5d7145f119b6904af12a92ef53402428e328"} Apr 24 22:10:44.537734 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:44.537692 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" event={"ID":"389855e7-b863-47d4-86c2-049c8df0131a","Type":"ContainerStarted","Data":"8bd7217505fa61966d2f0a071b6c840b7a99b4a1ab87ef14ec4913a037454586"} Apr 24 22:10:45.384079 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:45.384010 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" podUID="a86be664-9220-4f49-9bec-959f2cc40c74" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.47:8643/healthz\": dial tcp 10.133.0.47:8643: connect: connection refused" Apr 24 22:10:49.248178 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:10:49.248143 2570 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod389855e7_b863_47d4_86c2_049c8df0131a.slice/crio-62db5db7c4e504eb3d8086324c9d5d7145f119b6904af12a92ef53402428e328.scope\": RecentStats: unable to find data in memory cache]" Apr 24 22:10:49.557843 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:49.557746 2570 generic.go:358] "Generic (PLEG): container finished" podID="389855e7-b863-47d4-86c2-049c8df0131a" containerID="62db5db7c4e504eb3d8086324c9d5d7145f119b6904af12a92ef53402428e328" exitCode=0 Apr 24 22:10:49.557843 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:49.557782 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" event={"ID":"389855e7-b863-47d4-86c2-049c8df0131a","Type":"ContainerDied","Data":"62db5db7c4e504eb3d8086324c9d5d7145f119b6904af12a92ef53402428e328"} Apr 24 22:10:50.236930 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.236909 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" Apr 24 22:10:50.240254 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.240231 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn58m\" (UniqueName: \"kubernetes.io/projected/a86be664-9220-4f49-9bec-959f2cc40c74-kube-api-access-tn58m\") pod \"a86be664-9220-4f49-9bec-959f2cc40c74\" (UID: \"a86be664-9220-4f49-9bec-959f2cc40c74\") " Apr 24 22:10:50.240327 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.240283 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a86be664-9220-4f49-9bec-959f2cc40c74-kserve-provision-location\") pod \"a86be664-9220-4f49-9bec-959f2cc40c74\" (UID: \"a86be664-9220-4f49-9bec-959f2cc40c74\") " Apr 24 22:10:50.240367 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.240351 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a86be664-9220-4f49-9bec-959f2cc40c74-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"a86be664-9220-4f49-9bec-959f2cc40c74\" (UID: \"a86be664-9220-4f49-9bec-959f2cc40c74\") " Apr 24 22:10:50.240412 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.240395 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a86be664-9220-4f49-9bec-959f2cc40c74-proxy-tls\") pod \"a86be664-9220-4f49-9bec-959f2cc40c74\" (UID: \"a86be664-9220-4f49-9bec-959f2cc40c74\") " Apr 24 22:10:50.240588 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.240550 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a86be664-9220-4f49-9bec-959f2cc40c74-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a86be664-9220-4f49-9bec-959f2cc40c74" (UID: "a86be664-9220-4f49-9bec-959f2cc40c74"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:10:50.240752 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.240720 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a86be664-9220-4f49-9bec-959f2cc40c74-sklearn-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "sklearn-v2-mlserver-kube-rbac-proxy-sar-config") pod "a86be664-9220-4f49-9bec-959f2cc40c74" (UID: "a86be664-9220-4f49-9bec-959f2cc40c74"). InnerVolumeSpecName "sklearn-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:10:50.242594 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.242560 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a86be664-9220-4f49-9bec-959f2cc40c74-kube-api-access-tn58m" (OuterVolumeSpecName: "kube-api-access-tn58m") pod "a86be664-9220-4f49-9bec-959f2cc40c74" (UID: "a86be664-9220-4f49-9bec-959f2cc40c74"). InnerVolumeSpecName "kube-api-access-tn58m". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:10:50.242594 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.242575 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a86be664-9220-4f49-9bec-959f2cc40c74-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a86be664-9220-4f49-9bec-959f2cc40c74" (UID: "a86be664-9220-4f49-9bec-959f2cc40c74"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:10:50.341778 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.341692 2570 reconciler_common.go:299] "Volume detached for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a86be664-9220-4f49-9bec-959f2cc40c74-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:10:50.341778 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.341733 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a86be664-9220-4f49-9bec-959f2cc40c74-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:10:50.341778 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.341750 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tn58m\" (UniqueName: \"kubernetes.io/projected/a86be664-9220-4f49-9bec-959f2cc40c74-kube-api-access-tn58m\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:10:50.341778 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.341766 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a86be664-9220-4f49-9bec-959f2cc40c74-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:10:50.562575 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.562539 2570 generic.go:358] "Generic (PLEG): container finished" podID="a86be664-9220-4f49-9bec-959f2cc40c74" containerID="93532160f8c58faf5126f142ea9ecf01e6d934fa08bf326e07c00df7cb532994" exitCode=0 Apr 24 22:10:50.562757 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.562625 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" Apr 24 22:10:50.562757 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.562635 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" event={"ID":"a86be664-9220-4f49-9bec-959f2cc40c74","Type":"ContainerDied","Data":"93532160f8c58faf5126f142ea9ecf01e6d934fa08bf326e07c00df7cb532994"} Apr 24 22:10:50.562757 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.562685 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597" event={"ID":"a86be664-9220-4f49-9bec-959f2cc40c74","Type":"ContainerDied","Data":"f458a72a2d61b9242235c159fbc3e5203c104200e6f41d58c64debcc343c7f37"} Apr 24 22:10:50.562757 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.562711 2570 scope.go:117] "RemoveContainer" containerID="994c3c47684607287801d6f668cc2eb0b49f31103f44594fb13f5a7af47489ce" Apr 24 22:10:50.564825 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.564800 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" event={"ID":"389855e7-b863-47d4-86c2-049c8df0131a","Type":"ContainerStarted","Data":"500de1a46fa2652cd00fec6b4d1ec66656d47c7825d18b1d5c2dda26b3ff6f03"} Apr 24 22:10:50.564957 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.564836 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" event={"ID":"389855e7-b863-47d4-86c2-049c8df0131a","Type":"ContainerStarted","Data":"3223758e2ceea18b5cabce0d03bbcaffb277fb5e3f01126592baf69f03fbc4d5"} Apr 24 22:10:50.565131 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.565069 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" Apr 24 22:10:50.571911 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.571886 2570 scope.go:117] "RemoveContainer" containerID="93532160f8c58faf5126f142ea9ecf01e6d934fa08bf326e07c00df7cb532994" Apr 24 22:10:50.578733 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.578714 2570 scope.go:117] "RemoveContainer" containerID="74ee2dc6acd5214911abecc8136f385a8ff5c48a939f144b0de0b089a72d53a2" Apr 24 22:10:50.585250 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.585236 2570 scope.go:117] "RemoveContainer" containerID="994c3c47684607287801d6f668cc2eb0b49f31103f44594fb13f5a7af47489ce" Apr 24 22:10:50.585484 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:10:50.585469 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"994c3c47684607287801d6f668cc2eb0b49f31103f44594fb13f5a7af47489ce\": container with ID starting with 994c3c47684607287801d6f668cc2eb0b49f31103f44594fb13f5a7af47489ce not found: ID does not exist" containerID="994c3c47684607287801d6f668cc2eb0b49f31103f44594fb13f5a7af47489ce" Apr 24 22:10:50.585528 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.585492 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"994c3c47684607287801d6f668cc2eb0b49f31103f44594fb13f5a7af47489ce"} err="failed to get container status \"994c3c47684607287801d6f668cc2eb0b49f31103f44594fb13f5a7af47489ce\": rpc error: code = NotFound desc = could not find container \"994c3c47684607287801d6f668cc2eb0b49f31103f44594fb13f5a7af47489ce\": container with ID 
starting with 994c3c47684607287801d6f668cc2eb0b49f31103f44594fb13f5a7af47489ce not found: ID does not exist" Apr 24 22:10:50.585528 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.585507 2570 scope.go:117] "RemoveContainer" containerID="93532160f8c58faf5126f142ea9ecf01e6d934fa08bf326e07c00df7cb532994" Apr 24 22:10:50.585753 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:10:50.585732 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93532160f8c58faf5126f142ea9ecf01e6d934fa08bf326e07c00df7cb532994\": container with ID starting with 93532160f8c58faf5126f142ea9ecf01e6d934fa08bf326e07c00df7cb532994 not found: ID does not exist" containerID="93532160f8c58faf5126f142ea9ecf01e6d934fa08bf326e07c00df7cb532994" Apr 24 22:10:50.585808 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.585760 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93532160f8c58faf5126f142ea9ecf01e6d934fa08bf326e07c00df7cb532994"} err="failed to get container status \"93532160f8c58faf5126f142ea9ecf01e6d934fa08bf326e07c00df7cb532994\": rpc error: code = NotFound desc = could not find container \"93532160f8c58faf5126f142ea9ecf01e6d934fa08bf326e07c00df7cb532994\": container with ID starting with 93532160f8c58faf5126f142ea9ecf01e6d934fa08bf326e07c00df7cb532994 not found: ID does not exist" Apr 24 22:10:50.585808 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.585776 2570 scope.go:117] "RemoveContainer" containerID="74ee2dc6acd5214911abecc8136f385a8ff5c48a939f144b0de0b089a72d53a2" Apr 24 22:10:50.586037 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:10:50.586007 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74ee2dc6acd5214911abecc8136f385a8ff5c48a939f144b0de0b089a72d53a2\": container with ID starting with 74ee2dc6acd5214911abecc8136f385a8ff5c48a939f144b0de0b089a72d53a2 not found: ID does not exist" containerID="74ee2dc6acd5214911abecc8136f385a8ff5c48a939f144b0de0b089a72d53a2" Apr 24 22:10:50.586082 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.586045 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74ee2dc6acd5214911abecc8136f385a8ff5c48a939f144b0de0b089a72d53a2"} err="failed to get container status \"74ee2dc6acd5214911abecc8136f385a8ff5c48a939f144b0de0b089a72d53a2\": rpc error: code = NotFound desc = could not find container \"74ee2dc6acd5214911abecc8136f385a8ff5c48a939f144b0de0b089a72d53a2\": container with ID starting with 74ee2dc6acd5214911abecc8136f385a8ff5c48a939f144b0de0b089a72d53a2 not found: ID does not exist" Apr 24 22:10:50.589233 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.589196 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" podStartSLOduration=8.589175818 podStartE2EDuration="8.589175818s" podCreationTimestamp="2026-04-24 22:10:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:10:50.588108437 +0000 UTC m=+2603.011615847" watchObservedRunningTime="2026-04-24 22:10:50.589175818 +0000 UTC m=+2603.012683225" Apr 24 22:10:50.602871 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.602820 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597"] Apr 24 22:10:50.611009 
ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:50.610990 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-bl597"] Apr 24 22:10:51.568876 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:51.568845 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" Apr 24 22:10:51.570227 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:51.570196 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" podUID="389855e7-b863-47d4-86c2-049c8df0131a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 24 22:10:52.066836 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:52.066804 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a86be664-9220-4f49-9bec-959f2cc40c74" path="/var/lib/kubelet/pods/a86be664-9220-4f49-9bec-959f2cc40c74/volumes" Apr 24 22:10:52.571808 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:52.571770 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" podUID="389855e7-b863-47d4-86c2-049c8df0131a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 24 22:10:57.576440 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:57.576412 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" Apr 24 22:10:57.577006 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:10:57.576978 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" podUID="389855e7-b863-47d4-86c2-049c8df0131a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 24 22:11:07.577214 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:07.577175 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" Apr 24 22:11:19.816935 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:19.816895 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78_389855e7-b863-47d4-86c2-049c8df0131a/kserve-container/0.log" Apr 24 22:11:19.955143 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:19.955111 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78"] Apr 24 22:11:19.955470 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:19.955426 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" podUID="389855e7-b863-47d4-86c2-049c8df0131a" containerName="kserve-container" containerID="cri-o://3223758e2ceea18b5cabce0d03bbcaffb277fb5e3f01126592baf69f03fbc4d5" gracePeriod=30 Apr 24 22:11:19.955633 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:19.955459 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" podUID="389855e7-b863-47d4-86c2-049c8df0131a" containerName="kube-rbac-proxy" 
containerID="cri-o://500de1a46fa2652cd00fec6b4d1ec66656d47c7825d18b1d5c2dda26b3ff6f03" gracePeriod=30 Apr 24 22:11:20.029698 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.029666 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj"] Apr 24 22:11:20.029933 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.029922 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a86be664-9220-4f49-9bec-959f2cc40c74" containerName="kube-rbac-proxy" Apr 24 22:11:20.029976 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.029935 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86be664-9220-4f49-9bec-959f2cc40c74" containerName="kube-rbac-proxy" Apr 24 22:11:20.029976 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.029952 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a86be664-9220-4f49-9bec-959f2cc40c74" containerName="storage-initializer" Apr 24 22:11:20.029976 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.029957 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86be664-9220-4f49-9bec-959f2cc40c74" containerName="storage-initializer" Apr 24 22:11:20.029976 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.029965 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a86be664-9220-4f49-9bec-959f2cc40c74" containerName="kserve-container" Apr 24 22:11:20.029976 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.029971 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86be664-9220-4f49-9bec-959f2cc40c74" containerName="kserve-container" Apr 24 22:11:20.030294 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.030050 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="a86be664-9220-4f49-9bec-959f2cc40c74" containerName="kube-rbac-proxy" Apr 24 22:11:20.030294 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.030058 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="a86be664-9220-4f49-9bec-959f2cc40c74" containerName="kserve-container" Apr 24 22:11:20.033094 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.033076 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" Apr 24 22:11:20.035224 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.035204 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-predictor-serving-cert\"" Apr 24 22:11:20.035315 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.035228 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 24 22:11:20.043617 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.043597 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj"] Apr 24 22:11:20.160260 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.160227 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4edf360b-fc70-424f-8c05-47887616fec0-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj\" (UID: \"4edf360b-fc70-424f-8c05-47887616fec0\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" Apr 24 22:11:20.160366 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.160273 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4edf360b-fc70-424f-8c05-47887616fec0-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj\" (UID: \"4edf360b-fc70-424f-8c05-47887616fec0\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" Apr 24 22:11:20.160448 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.160365 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx22p\" (UniqueName: \"kubernetes.io/projected/4edf360b-fc70-424f-8c05-47887616fec0-kube-api-access-mx22p\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj\" (UID: \"4edf360b-fc70-424f-8c05-47887616fec0\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" Apr 24 22:11:20.160448 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.160408 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4edf360b-fc70-424f-8c05-47887616fec0-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj\" (UID: \"4edf360b-fc70-424f-8c05-47887616fec0\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" Apr 24 22:11:20.261527 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.261503 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4edf360b-fc70-424f-8c05-47887616fec0-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj\" (UID: \"4edf360b-fc70-424f-8c05-47887616fec0\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" Apr 24 22:11:20.261633 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.261543 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx22p\" (UniqueName: \"kubernetes.io/projected/4edf360b-fc70-424f-8c05-47887616fec0-kube-api-access-mx22p\") pod 
\"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj\" (UID: \"4edf360b-fc70-424f-8c05-47887616fec0\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" Apr 24 22:11:20.261633 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.261567 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4edf360b-fc70-424f-8c05-47887616fec0-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj\" (UID: \"4edf360b-fc70-424f-8c05-47887616fec0\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" Apr 24 22:11:20.261633 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.261600 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4edf360b-fc70-424f-8c05-47887616fec0-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj\" (UID: \"4edf360b-fc70-424f-8c05-47887616fec0\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" Apr 24 22:11:20.261819 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:11:20.261645 2570 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-serving-cert: secret "isvc-sklearn-v2-runtime-predictor-serving-cert" not found Apr 24 22:11:20.261819 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:11:20.261713 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4edf360b-fc70-424f-8c05-47887616fec0-proxy-tls podName:4edf360b-fc70-424f-8c05-47887616fec0 nodeName:}" failed. No retries permitted until 2026-04-24 22:11:20.761693354 +0000 UTC m=+2633.185200739 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4edf360b-fc70-424f-8c05-47887616fec0-proxy-tls") pod "isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" (UID: "4edf360b-fc70-424f-8c05-47887616fec0") : secret "isvc-sklearn-v2-runtime-predictor-serving-cert" not found Apr 24 22:11:20.261912 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.261895 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4edf360b-fc70-424f-8c05-47887616fec0-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj\" (UID: \"4edf360b-fc70-424f-8c05-47887616fec0\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" Apr 24 22:11:20.262205 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.262188 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4edf360b-fc70-424f-8c05-47887616fec0-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj\" (UID: \"4edf360b-fc70-424f-8c05-47887616fec0\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" Apr 24 22:11:20.273176 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.273148 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx22p\" (UniqueName: \"kubernetes.io/projected/4edf360b-fc70-424f-8c05-47887616fec0-kube-api-access-mx22p\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj\" (UID: \"4edf360b-fc70-424f-8c05-47887616fec0\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" Apr 24 22:11:20.665286 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.665251 2570 generic.go:358] "Generic (PLEG): container finished" podID="389855e7-b863-47d4-86c2-049c8df0131a" containerID="500de1a46fa2652cd00fec6b4d1ec66656d47c7825d18b1d5c2dda26b3ff6f03" exitCode=2 Apr 24 22:11:20.665455 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.665292 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" event={"ID":"389855e7-b863-47d4-86c2-049c8df0131a","Type":"ContainerDied","Data":"500de1a46fa2652cd00fec6b4d1ec66656d47c7825d18b1d5c2dda26b3ff6f03"} Apr 24 22:11:20.765077 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:20.765042 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4edf360b-fc70-424f-8c05-47887616fec0-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj\" (UID: \"4edf360b-fc70-424f-8c05-47887616fec0\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" Apr 24 22:11:20.765259 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:11:20.765195 2570 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-serving-cert: secret "isvc-sklearn-v2-runtime-predictor-serving-cert" not found Apr 24 22:11:20.765302 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:11:20.765262 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4edf360b-fc70-424f-8c05-47887616fec0-proxy-tls podName:4edf360b-fc70-424f-8c05-47887616fec0 nodeName:}" failed. No retries permitted until 2026-04-24 22:11:21.765244386 +0000 UTC m=+2634.188751774 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4edf360b-fc70-424f-8c05-47887616fec0-proxy-tls") pod "isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" (UID: "4edf360b-fc70-424f-8c05-47887616fec0") : secret "isvc-sklearn-v2-runtime-predictor-serving-cert" not found Apr 24 22:11:21.084088 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.084064 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" Apr 24 22:11:21.168126 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.168099 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/389855e7-b863-47d4-86c2-049c8df0131a-proxy-tls\") pod \"389855e7-b863-47d4-86c2-049c8df0131a\" (UID: \"389855e7-b863-47d4-86c2-049c8df0131a\") " Apr 24 22:11:21.168271 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.168172 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/389855e7-b863-47d4-86c2-049c8df0131a-kserve-provision-location\") pod \"389855e7-b863-47d4-86c2-049c8df0131a\" (UID: \"389855e7-b863-47d4-86c2-049c8df0131a\") " Apr 24 22:11:21.168346 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.168319 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/389855e7-b863-47d4-86c2-049c8df0131a-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"389855e7-b863-47d4-86c2-049c8df0131a\" (UID: \"389855e7-b863-47d4-86c2-049c8df0131a\") " Apr 24 22:11:21.168410 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.168355 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8r6g\" (UniqueName: \"kubernetes.io/projected/389855e7-b863-47d4-86c2-049c8df0131a-kube-api-access-g8r6g\") pod \"389855e7-b863-47d4-86c2-049c8df0131a\" (UID: \"389855e7-b863-47d4-86c2-049c8df0131a\") " Apr 24 22:11:21.168636 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.168613 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/389855e7-b863-47d4-86c2-049c8df0131a-isvc-sklearn-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-runtime-kube-rbac-proxy-sar-config") pod "389855e7-b863-47d4-86c2-049c8df0131a" (UID: "389855e7-b863-47d4-86c2-049c8df0131a"). InnerVolumeSpecName "isvc-sklearn-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:11:21.170257 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.170233 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/389855e7-b863-47d4-86c2-049c8df0131a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "389855e7-b863-47d4-86c2-049c8df0131a" (UID: "389855e7-b863-47d4-86c2-049c8df0131a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:11:21.170593 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.170574 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/389855e7-b863-47d4-86c2-049c8df0131a-kube-api-access-g8r6g" (OuterVolumeSpecName: "kube-api-access-g8r6g") pod "389855e7-b863-47d4-86c2-049c8df0131a" (UID: "389855e7-b863-47d4-86c2-049c8df0131a"). InnerVolumeSpecName "kube-api-access-g8r6g". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:11:21.192203 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.192177 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/389855e7-b863-47d4-86c2-049c8df0131a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "389855e7-b863-47d4-86c2-049c8df0131a" (UID: "389855e7-b863-47d4-86c2-049c8df0131a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:11:21.269194 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.269130 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/389855e7-b863-47d4-86c2-049c8df0131a-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:11:21.269194 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.269151 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/389855e7-b863-47d4-86c2-049c8df0131a-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:11:21.269194 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.269163 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g8r6g\" (UniqueName: \"kubernetes.io/projected/389855e7-b863-47d4-86c2-049c8df0131a-kube-api-access-g8r6g\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:11:21.269194 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.269172 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/389855e7-b863-47d4-86c2-049c8df0131a-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:11:21.669978 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.669945 2570 generic.go:358] "Generic (PLEG): container finished" podID="389855e7-b863-47d4-86c2-049c8df0131a" containerID="3223758e2ceea18b5cabce0d03bbcaffb277fb5e3f01126592baf69f03fbc4d5" exitCode=0 Apr 24 22:11:21.670171 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.669986 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" event={"ID":"389855e7-b863-47d4-86c2-049c8df0131a","Type":"ContainerDied","Data":"3223758e2ceea18b5cabce0d03bbcaffb277fb5e3f01126592baf69f03fbc4d5"} Apr 24 22:11:21.670171 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.670017 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" event={"ID":"389855e7-b863-47d4-86c2-049c8df0131a","Type":"ContainerDied","Data":"8bd7217505fa61966d2f0a071b6c840b7a99b4a1ab87ef14ec4913a037454586"} Apr 24 22:11:21.670171 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.670054 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78" Apr 24 22:11:21.670171 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.670059 2570 scope.go:117] "RemoveContainer" containerID="500de1a46fa2652cd00fec6b4d1ec66656d47c7825d18b1d5c2dda26b3ff6f03" Apr 24 22:11:21.677900 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.677874 2570 scope.go:117] "RemoveContainer" containerID="3223758e2ceea18b5cabce0d03bbcaffb277fb5e3f01126592baf69f03fbc4d5" Apr 24 22:11:21.684775 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.684758 2570 scope.go:117] "RemoveContainer" containerID="62db5db7c4e504eb3d8086324c9d5d7145f119b6904af12a92ef53402428e328" Apr 24 22:11:21.691987 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.691967 2570 scope.go:117] "RemoveContainer" containerID="500de1a46fa2652cd00fec6b4d1ec66656d47c7825d18b1d5c2dda26b3ff6f03" Apr 24 22:11:21.692292 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:11:21.692272 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"500de1a46fa2652cd00fec6b4d1ec66656d47c7825d18b1d5c2dda26b3ff6f03\": container with ID starting with 500de1a46fa2652cd00fec6b4d1ec66656d47c7825d18b1d5c2dda26b3ff6f03 not found: ID does not exist" containerID="500de1a46fa2652cd00fec6b4d1ec66656d47c7825d18b1d5c2dda26b3ff6f03" Apr 24 22:11:21.692377 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.692299 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"500de1a46fa2652cd00fec6b4d1ec66656d47c7825d18b1d5c2dda26b3ff6f03"} err="failed to get container status \"500de1a46fa2652cd00fec6b4d1ec66656d47c7825d18b1d5c2dda26b3ff6f03\": rpc error: code = NotFound desc = could not find container \"500de1a46fa2652cd00fec6b4d1ec66656d47c7825d18b1d5c2dda26b3ff6f03\": container with ID starting with 500de1a46fa2652cd00fec6b4d1ec66656d47c7825d18b1d5c2dda26b3ff6f03 not found: ID does not exist" Apr 24 22:11:21.692377 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.692316 2570 scope.go:117] "RemoveContainer" containerID="3223758e2ceea18b5cabce0d03bbcaffb277fb5e3f01126592baf69f03fbc4d5" Apr 24 22:11:21.692543 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:11:21.692521 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3223758e2ceea18b5cabce0d03bbcaffb277fb5e3f01126592baf69f03fbc4d5\": container with ID starting with 3223758e2ceea18b5cabce0d03bbcaffb277fb5e3f01126592baf69f03fbc4d5 not found: ID does not exist" containerID="3223758e2ceea18b5cabce0d03bbcaffb277fb5e3f01126592baf69f03fbc4d5" Apr 24 22:11:21.692649 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.692546 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3223758e2ceea18b5cabce0d03bbcaffb277fb5e3f01126592baf69f03fbc4d5"} err="failed to get container status \"3223758e2ceea18b5cabce0d03bbcaffb277fb5e3f01126592baf69f03fbc4d5\": rpc error: code = NotFound desc = could not find container \"3223758e2ceea18b5cabce0d03bbcaffb277fb5e3f01126592baf69f03fbc4d5\": container with ID starting with 3223758e2ceea18b5cabce0d03bbcaffb277fb5e3f01126592baf69f03fbc4d5 not found: ID does not exist" Apr 24 22:11:21.692649 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.692559 2570 scope.go:117] "RemoveContainer" containerID="62db5db7c4e504eb3d8086324c9d5d7145f119b6904af12a92ef53402428e328" Apr 24 22:11:21.692843 ip-10-0-129-230 kubenswrapper[2570]: E0424 
22:11:21.692820 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62db5db7c4e504eb3d8086324c9d5d7145f119b6904af12a92ef53402428e328\": container with ID starting with 62db5db7c4e504eb3d8086324c9d5d7145f119b6904af12a92ef53402428e328 not found: ID does not exist" containerID="62db5db7c4e504eb3d8086324c9d5d7145f119b6904af12a92ef53402428e328" Apr 24 22:11:21.692888 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.692857 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62db5db7c4e504eb3d8086324c9d5d7145f119b6904af12a92ef53402428e328"} err="failed to get container status \"62db5db7c4e504eb3d8086324c9d5d7145f119b6904af12a92ef53402428e328\": rpc error: code = NotFound desc = could not find container \"62db5db7c4e504eb3d8086324c9d5d7145f119b6904af12a92ef53402428e328\": container with ID starting with 62db5db7c4e504eb3d8086324c9d5d7145f119b6904af12a92ef53402428e328 not found: ID does not exist" Apr 24 22:11:21.692888 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.692830 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78"] Apr 24 22:11:21.700506 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.700484 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-66f5c6b7dd-v7l78"] Apr 24 22:11:21.772969 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.772933 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4edf360b-fc70-424f-8c05-47887616fec0-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj\" (UID: \"4edf360b-fc70-424f-8c05-47887616fec0\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" Apr 24 22:11:21.775400 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.775377 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4edf360b-fc70-424f-8c05-47887616fec0-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj\" (UID: \"4edf360b-fc70-424f-8c05-47887616fec0\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" Apr 24 22:11:21.842828 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.842798 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" Apr 24 22:11:21.961512 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:21.961426 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj"] Apr 24 22:11:21.965336 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:11:21.965308 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4edf360b_fc70_424f_8c05_47887616fec0.slice/crio-faf3930c44b360546856f5d527f439f460b8874bf11ff4490eeb73e3b7e7cf2f WatchSource:0}: Error finding container faf3930c44b360546856f5d527f439f460b8874bf11ff4490eeb73e3b7e7cf2f: Status 404 returned error can't find the container with id faf3930c44b360546856f5d527f439f460b8874bf11ff4490eeb73e3b7e7cf2f Apr 24 22:11:22.067307 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:22.067282 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="389855e7-b863-47d4-86c2-049c8df0131a" path="/var/lib/kubelet/pods/389855e7-b863-47d4-86c2-049c8df0131a/volumes" Apr 24 22:11:22.673553 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:22.673514 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" event={"ID":"4edf360b-fc70-424f-8c05-47887616fec0","Type":"ContainerStarted","Data":"f3117b1764ab4a56ba40ae74cd701043cd8cb320f5194d36e4e59508699a009e"} Apr 24 22:11:22.673553 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:22.673558 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" event={"ID":"4edf360b-fc70-424f-8c05-47887616fec0","Type":"ContainerStarted","Data":"faf3930c44b360546856f5d527f439f460b8874bf11ff4490eeb73e3b7e7cf2f"} Apr 24 22:11:25.684502 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:25.684464 2570 generic.go:358] "Generic (PLEG): container finished" podID="4edf360b-fc70-424f-8c05-47887616fec0" containerID="f3117b1764ab4a56ba40ae74cd701043cd8cb320f5194d36e4e59508699a009e" exitCode=0 Apr 24 22:11:25.684502 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:25.684506 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" event={"ID":"4edf360b-fc70-424f-8c05-47887616fec0","Type":"ContainerDied","Data":"f3117b1764ab4a56ba40ae74cd701043cd8cb320f5194d36e4e59508699a009e"} Apr 24 22:11:26.688953 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:26.688914 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" event={"ID":"4edf360b-fc70-424f-8c05-47887616fec0","Type":"ContainerStarted","Data":"d5c588b20122b688ae1aa0ee549ca1f1f574d668f694420d8886a746b299ca12"} Apr 24 22:11:26.689376 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:26.688962 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" event={"ID":"4edf360b-fc70-424f-8c05-47887616fec0","Type":"ContainerStarted","Data":"571ec23f95f8bf8342941ebe1fa0c552c68f26e33b17bffafb29657532db5eb8"} Apr 24 22:11:26.689376 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:26.689208 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" Apr 24 22:11:26.721529 ip-10-0-129-230 kubenswrapper[2570]: I0424 
22:11:26.721475 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" podStartSLOduration=6.721462867 podStartE2EDuration="6.721462867s" podCreationTimestamp="2026-04-24 22:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:11:26.720963647 +0000 UTC m=+2639.144471054" watchObservedRunningTime="2026-04-24 22:11:26.721462867 +0000 UTC m=+2639.144970274" Apr 24 22:11:27.692545 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:27.692519 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" Apr 24 22:11:33.699672 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:11:33.699638 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" Apr 24 22:12:03.717418 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:03.717372 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" podUID="4edf360b-fc70-424f-8c05-47887616fec0" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 24 22:12:13.703044 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:13.702960 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" Apr 24 22:12:20.204145 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.204109 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb"] Apr 24 22:12:20.204707 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.204399 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="389855e7-b863-47d4-86c2-049c8df0131a" containerName="kube-rbac-proxy" Apr 24 22:12:20.204707 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.204411 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="389855e7-b863-47d4-86c2-049c8df0131a" containerName="kube-rbac-proxy" Apr 24 22:12:20.204707 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.204423 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="389855e7-b863-47d4-86c2-049c8df0131a" containerName="storage-initializer" Apr 24 22:12:20.204707 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.204428 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="389855e7-b863-47d4-86c2-049c8df0131a" containerName="storage-initializer" Apr 24 22:12:20.204707 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.204438 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="389855e7-b863-47d4-86c2-049c8df0131a" containerName="kserve-container" Apr 24 22:12:20.204707 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.204443 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="389855e7-b863-47d4-86c2-049c8df0131a" containerName="kserve-container" Apr 24 22:12:20.204707 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.204486 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="389855e7-b863-47d4-86c2-049c8df0131a" containerName="kserve-container" Apr 24 22:12:20.204707 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.204497 2570 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="389855e7-b863-47d4-86c2-049c8df0131a" containerName="kube-rbac-proxy" Apr 24 22:12:20.207542 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.207520 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" Apr 24 22:12:20.209962 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.209941 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-predictor-serving-cert\"" Apr 24 22:12:20.210134 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.210106 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 24 22:12:20.222486 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.222463 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb"] Apr 24 22:12:20.236116 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.236097 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj"] Apr 24 22:12:20.236376 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.236358 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" podUID="4edf360b-fc70-424f-8c05-47887616fec0" containerName="kserve-container" containerID="cri-o://571ec23f95f8bf8342941ebe1fa0c552c68f26e33b17bffafb29657532db5eb8" gracePeriod=30 Apr 24 22:12:20.236454 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.236432 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" podUID="4edf360b-fc70-424f-8c05-47887616fec0" containerName="kube-rbac-proxy" containerID="cri-o://d5c588b20122b688ae1aa0ee549ca1f1f574d668f694420d8886a746b299ca12" gracePeriod=30 Apr 24 22:12:20.395454 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.395426 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svpfg\" (UniqueName: \"kubernetes.io/projected/395d5fa0-85d1-4121-86c9-730d224a3ec9-kube-api-access-svpfg\") pod \"isvc-sklearn-v2-predictor-868cb76d-vmbnb\" (UID: \"395d5fa0-85d1-4121-86c9-730d224a3ec9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" Apr 24 22:12:20.395578 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.395467 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/395d5fa0-85d1-4121-86c9-730d224a3ec9-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-868cb76d-vmbnb\" (UID: \"395d5fa0-85d1-4121-86c9-730d224a3ec9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" Apr 24 22:12:20.395578 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.395522 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/395d5fa0-85d1-4121-86c9-730d224a3ec9-proxy-tls\") pod \"isvc-sklearn-v2-predictor-868cb76d-vmbnb\" (UID: \"395d5fa0-85d1-4121-86c9-730d224a3ec9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" Apr 24 22:12:20.395578 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.395551 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/395d5fa0-85d1-4121-86c9-730d224a3ec9-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-868cb76d-vmbnb\" (UID: \"395d5fa0-85d1-4121-86c9-730d224a3ec9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" Apr 24 22:12:20.496177 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.496102 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/395d5fa0-85d1-4121-86c9-730d224a3ec9-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-868cb76d-vmbnb\" (UID: \"395d5fa0-85d1-4121-86c9-730d224a3ec9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" Apr 24 22:12:20.496177 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.496162 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svpfg\" (UniqueName: \"kubernetes.io/projected/395d5fa0-85d1-4121-86c9-730d224a3ec9-kube-api-access-svpfg\") pod \"isvc-sklearn-v2-predictor-868cb76d-vmbnb\" (UID: \"395d5fa0-85d1-4121-86c9-730d224a3ec9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" Apr 24 22:12:20.496390 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.496207 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/395d5fa0-85d1-4121-86c9-730d224a3ec9-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-868cb76d-vmbnb\" (UID: \"395d5fa0-85d1-4121-86c9-730d224a3ec9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" Apr 24 22:12:20.496390 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.496257 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/395d5fa0-85d1-4121-86c9-730d224a3ec9-proxy-tls\") pod \"isvc-sklearn-v2-predictor-868cb76d-vmbnb\" (UID: \"395d5fa0-85d1-4121-86c9-730d224a3ec9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" Apr 24 22:12:20.496390 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:12:20.496379 2570 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-v2-predictor-serving-cert: secret "isvc-sklearn-v2-predictor-serving-cert" not found Apr 24 22:12:20.496542 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:12:20.496447 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/395d5fa0-85d1-4121-86c9-730d224a3ec9-proxy-tls podName:395d5fa0-85d1-4121-86c9-730d224a3ec9 nodeName:}" failed. No retries permitted until 2026-04-24 22:12:20.996426389 +0000 UTC m=+2693.419933780 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/395d5fa0-85d1-4121-86c9-730d224a3ec9-proxy-tls") pod "isvc-sklearn-v2-predictor-868cb76d-vmbnb" (UID: "395d5fa0-85d1-4121-86c9-730d224a3ec9") : secret "isvc-sklearn-v2-predictor-serving-cert" not found Apr 24 22:12:20.496630 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.496608 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/395d5fa0-85d1-4121-86c9-730d224a3ec9-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-868cb76d-vmbnb\" (UID: \"395d5fa0-85d1-4121-86c9-730d224a3ec9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" Apr 24 22:12:20.496720 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.496703 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/395d5fa0-85d1-4121-86c9-730d224a3ec9-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-868cb76d-vmbnb\" (UID: \"395d5fa0-85d1-4121-86c9-730d224a3ec9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" Apr 24 22:12:20.504798 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.504778 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svpfg\" (UniqueName: \"kubernetes.io/projected/395d5fa0-85d1-4121-86c9-730d224a3ec9-kube-api-access-svpfg\") pod \"isvc-sklearn-v2-predictor-868cb76d-vmbnb\" (UID: \"395d5fa0-85d1-4121-86c9-730d224a3ec9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" Apr 24 22:12:20.842082 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.841987 2570 generic.go:358] "Generic (PLEG): container finished" podID="4edf360b-fc70-424f-8c05-47887616fec0" containerID="d5c588b20122b688ae1aa0ee549ca1f1f574d668f694420d8886a746b299ca12" exitCode=2 Apr 24 22:12:20.842082 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.842050 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" event={"ID":"4edf360b-fc70-424f-8c05-47887616fec0","Type":"ContainerDied","Data":"d5c588b20122b688ae1aa0ee549ca1f1f574d668f694420d8886a746b299ca12"} Apr 24 22:12:20.999960 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:20.999928 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/395d5fa0-85d1-4121-86c9-730d224a3ec9-proxy-tls\") pod \"isvc-sklearn-v2-predictor-868cb76d-vmbnb\" (UID: \"395d5fa0-85d1-4121-86c9-730d224a3ec9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" Apr 24 22:12:21.002401 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:21.002380 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/395d5fa0-85d1-4121-86c9-730d224a3ec9-proxy-tls\") pod \"isvc-sklearn-v2-predictor-868cb76d-vmbnb\" (UID: \"395d5fa0-85d1-4121-86c9-730d224a3ec9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" Apr 24 22:12:21.116561 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:21.116488 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" Apr 24 22:12:21.236010 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:21.235986 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb"] Apr 24 22:12:21.238467 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:12:21.238447 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod395d5fa0_85d1_4121_86c9_730d224a3ec9.slice/crio-95601067ecafe20006ca6a923f56e8d30e652b05a3997f9683c3ab4cbb13fbbe WatchSource:0}: Error finding container 95601067ecafe20006ca6a923f56e8d30e652b05a3997f9683c3ab4cbb13fbbe: Status 404 returned error can't find the container with id 95601067ecafe20006ca6a923f56e8d30e652b05a3997f9683c3ab4cbb13fbbe Apr 24 22:12:21.848659 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:21.848624 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" event={"ID":"395d5fa0-85d1-4121-86c9-730d224a3ec9","Type":"ContainerStarted","Data":"6dd4b7778bae3af32f7cf1905a15d3f80abcb5672f18f0d21f3a553189591429"} Apr 24 22:12:21.848659 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:21.848661 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" event={"ID":"395d5fa0-85d1-4121-86c9-730d224a3ec9","Type":"ContainerStarted","Data":"95601067ecafe20006ca6a923f56e8d30e652b05a3997f9683c3ab4cbb13fbbe"} Apr 24 22:12:23.695878 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:23.695835 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" podUID="4edf360b-fc70-424f-8c05-47887616fec0" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.49:8643/healthz\": dial tcp 10.133.0.49:8643: connect: connection refused" Apr 24 22:12:25.860651 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:25.860614 2570 generic.go:358] "Generic (PLEG): container finished" podID="395d5fa0-85d1-4121-86c9-730d224a3ec9" containerID="6dd4b7778bae3af32f7cf1905a15d3f80abcb5672f18f0d21f3a553189591429" exitCode=0 Apr 24 22:12:25.861103 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:25.860686 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" event={"ID":"395d5fa0-85d1-4121-86c9-730d224a3ec9","Type":"ContainerDied","Data":"6dd4b7778bae3af32f7cf1905a15d3f80abcb5672f18f0d21f3a553189591429"} Apr 24 22:12:26.865387 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:26.865352 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" event={"ID":"395d5fa0-85d1-4121-86c9-730d224a3ec9","Type":"ContainerStarted","Data":"715f86d43dd38a8659598b33ec2b9db318bfab6935bd1ae9bee89336334ae570"} Apr 24 22:12:26.865387 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:26.865392 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" event={"ID":"395d5fa0-85d1-4121-86c9-730d224a3ec9","Type":"ContainerStarted","Data":"cf96baf145a55f3b148ca443f682183212bfb3e89826dfe97c56e27a36a59203"} Apr 24 22:12:26.865802 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:26.865737 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" Apr 24 
22:12:26.865884 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:26.865864 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" Apr 24 22:12:26.867169 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:26.867144 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" podUID="395d5fa0-85d1-4121-86c9-730d224a3ec9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 24 22:12:26.884891 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:26.884842 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" podStartSLOduration=6.884823375 podStartE2EDuration="6.884823375s" podCreationTimestamp="2026-04-24 22:12:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:12:26.883224327 +0000 UTC m=+2699.306731732" watchObservedRunningTime="2026-04-24 22:12:26.884823375 +0000 UTC m=+2699.308330786" Apr 24 22:12:27.476363 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.476335 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" Apr 24 22:12:27.648833 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.648761 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx22p\" (UniqueName: \"kubernetes.io/projected/4edf360b-fc70-424f-8c05-47887616fec0-kube-api-access-mx22p\") pod \"4edf360b-fc70-424f-8c05-47887616fec0\" (UID: \"4edf360b-fc70-424f-8c05-47887616fec0\") " Apr 24 22:12:27.648833 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.648809 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4edf360b-fc70-424f-8c05-47887616fec0-proxy-tls\") pod \"4edf360b-fc70-424f-8c05-47887616fec0\" (UID: \"4edf360b-fc70-424f-8c05-47887616fec0\") " Apr 24 22:12:27.648833 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.648831 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4edf360b-fc70-424f-8c05-47887616fec0-kserve-provision-location\") pod \"4edf360b-fc70-424f-8c05-47887616fec0\" (UID: \"4edf360b-fc70-424f-8c05-47887616fec0\") " Apr 24 22:12:27.649121 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.648858 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4edf360b-fc70-424f-8c05-47887616fec0-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"4edf360b-fc70-424f-8c05-47887616fec0\" (UID: \"4edf360b-fc70-424f-8c05-47887616fec0\") " Apr 24 22:12:27.649195 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.649158 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4edf360b-fc70-424f-8c05-47887616fec0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4edf360b-fc70-424f-8c05-47887616fec0" (UID: "4edf360b-fc70-424f-8c05-47887616fec0"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:12:27.649298 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.649273 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4edf360b-fc70-424f-8c05-47887616fec0-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config") pod "4edf360b-fc70-424f-8c05-47887616fec0" (UID: "4edf360b-fc70-424f-8c05-47887616fec0"). InnerVolumeSpecName "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:12:27.651000 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.650974 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4edf360b-fc70-424f-8c05-47887616fec0-kube-api-access-mx22p" (OuterVolumeSpecName: "kube-api-access-mx22p") pod "4edf360b-fc70-424f-8c05-47887616fec0" (UID: "4edf360b-fc70-424f-8c05-47887616fec0"). InnerVolumeSpecName "kube-api-access-mx22p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:12:27.651000 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.650977 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4edf360b-fc70-424f-8c05-47887616fec0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4edf360b-fc70-424f-8c05-47887616fec0" (UID: "4edf360b-fc70-424f-8c05-47887616fec0"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:12:27.749456 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.749419 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4edf360b-fc70-424f-8c05-47887616fec0-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:12:27.749456 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.749452 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4edf360b-fc70-424f-8c05-47887616fec0-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:12:27.749642 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.749467 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4edf360b-fc70-424f-8c05-47887616fec0-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:12:27.749642 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.749482 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mx22p\" (UniqueName: \"kubernetes.io/projected/4edf360b-fc70-424f-8c05-47887616fec0-kube-api-access-mx22p\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:12:27.869571 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.869540 2570 generic.go:358] "Generic (PLEG): container finished" podID="4edf360b-fc70-424f-8c05-47887616fec0" containerID="571ec23f95f8bf8342941ebe1fa0c552c68f26e33b17bffafb29657532db5eb8" exitCode=0 Apr 24 22:12:27.869949 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.869626 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" Apr 24 22:12:27.869949 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.869622 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" event={"ID":"4edf360b-fc70-424f-8c05-47887616fec0","Type":"ContainerDied","Data":"571ec23f95f8bf8342941ebe1fa0c552c68f26e33b17bffafb29657532db5eb8"} Apr 24 22:12:27.869949 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.869662 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj" event={"ID":"4edf360b-fc70-424f-8c05-47887616fec0","Type":"ContainerDied","Data":"faf3930c44b360546856f5d527f439f460b8874bf11ff4490eeb73e3b7e7cf2f"} Apr 24 22:12:27.869949 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.869683 2570 scope.go:117] "RemoveContainer" containerID="d5c588b20122b688ae1aa0ee549ca1f1f574d668f694420d8886a746b299ca12" Apr 24 22:12:27.870327 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.870303 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" podUID="395d5fa0-85d1-4121-86c9-730d224a3ec9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 24 22:12:27.877974 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.877959 2570 scope.go:117] "RemoveContainer" containerID="571ec23f95f8bf8342941ebe1fa0c552c68f26e33b17bffafb29657532db5eb8" Apr 24 22:12:27.885570 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.885555 2570 scope.go:117] "RemoveContainer" containerID="f3117b1764ab4a56ba40ae74cd701043cd8cb320f5194d36e4e59508699a009e" Apr 24 22:12:27.890780 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.890761 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj"] Apr 24 22:12:27.892620 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.892545 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vtgpj"] Apr 24 22:12:27.892684 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.892644 2570 scope.go:117] "RemoveContainer" containerID="d5c588b20122b688ae1aa0ee549ca1f1f574d668f694420d8886a746b299ca12" Apr 24 22:12:27.892913 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:12:27.892895 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5c588b20122b688ae1aa0ee549ca1f1f574d668f694420d8886a746b299ca12\": container with ID starting with d5c588b20122b688ae1aa0ee549ca1f1f574d668f694420d8886a746b299ca12 not found: ID does not exist" containerID="d5c588b20122b688ae1aa0ee549ca1f1f574d668f694420d8886a746b299ca12" Apr 24 22:12:27.892964 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.892923 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c588b20122b688ae1aa0ee549ca1f1f574d668f694420d8886a746b299ca12"} err="failed to get container status \"d5c588b20122b688ae1aa0ee549ca1f1f574d668f694420d8886a746b299ca12\": rpc error: code = NotFound desc = could not find container \"d5c588b20122b688ae1aa0ee549ca1f1f574d668f694420d8886a746b299ca12\": container with ID starting with d5c588b20122b688ae1aa0ee549ca1f1f574d668f694420d8886a746b299ca12 not found: ID does not exist" Apr 24 22:12:27.892964 ip-10-0-129-230 
kubenswrapper[2570]: I0424 22:12:27.892940 2570 scope.go:117] "RemoveContainer" containerID="571ec23f95f8bf8342941ebe1fa0c552c68f26e33b17bffafb29657532db5eb8" Apr 24 22:12:27.893220 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:12:27.893202 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"571ec23f95f8bf8342941ebe1fa0c552c68f26e33b17bffafb29657532db5eb8\": container with ID starting with 571ec23f95f8bf8342941ebe1fa0c552c68f26e33b17bffafb29657532db5eb8 not found: ID does not exist" containerID="571ec23f95f8bf8342941ebe1fa0c552c68f26e33b17bffafb29657532db5eb8" Apr 24 22:12:27.893272 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.893226 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"571ec23f95f8bf8342941ebe1fa0c552c68f26e33b17bffafb29657532db5eb8"} err="failed to get container status \"571ec23f95f8bf8342941ebe1fa0c552c68f26e33b17bffafb29657532db5eb8\": rpc error: code = NotFound desc = could not find container \"571ec23f95f8bf8342941ebe1fa0c552c68f26e33b17bffafb29657532db5eb8\": container with ID starting with 571ec23f95f8bf8342941ebe1fa0c552c68f26e33b17bffafb29657532db5eb8 not found: ID does not exist" Apr 24 22:12:27.893272 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.893241 2570 scope.go:117] "RemoveContainer" containerID="f3117b1764ab4a56ba40ae74cd701043cd8cb320f5194d36e4e59508699a009e" Apr 24 22:12:27.893470 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:12:27.893453 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3117b1764ab4a56ba40ae74cd701043cd8cb320f5194d36e4e59508699a009e\": container with ID starting with f3117b1764ab4a56ba40ae74cd701043cd8cb320f5194d36e4e59508699a009e not found: ID does not exist" containerID="f3117b1764ab4a56ba40ae74cd701043cd8cb320f5194d36e4e59508699a009e" Apr 24 22:12:27.893525 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:27.893478 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3117b1764ab4a56ba40ae74cd701043cd8cb320f5194d36e4e59508699a009e"} err="failed to get container status \"f3117b1764ab4a56ba40ae74cd701043cd8cb320f5194d36e4e59508699a009e\": rpc error: code = NotFound desc = could not find container \"f3117b1764ab4a56ba40ae74cd701043cd8cb320f5194d36e4e59508699a009e\": container with ID starting with f3117b1764ab4a56ba40ae74cd701043cd8cb320f5194d36e4e59508699a009e not found: ID does not exist" Apr 24 22:12:28.067692 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:28.067662 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4edf360b-fc70-424f-8c05-47887616fec0" path="/var/lib/kubelet/pods/4edf360b-fc70-424f-8c05-47887616fec0/volumes" Apr 24 22:12:28.143295 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:28.143270 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 22:12:28.156036 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:28.155995 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 22:12:32.874553 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:32.874527 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" Apr 24 
22:12:32.875134 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:32.875110 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" podUID="395d5fa0-85d1-4121-86c9-730d224a3ec9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 24 22:12:42.875068 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:42.875006 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" podUID="395d5fa0-85d1-4121-86c9-730d224a3ec9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 24 22:12:52.875911 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:12:52.875874 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" podUID="395d5fa0-85d1-4121-86c9-730d224a3ec9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 24 22:13:02.875317 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:02.875276 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" podUID="395d5fa0-85d1-4121-86c9-730d224a3ec9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 24 22:13:12.875909 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:12.875874 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" podUID="395d5fa0-85d1-4121-86c9-730d224a3ec9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 24 22:13:22.875497 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:22.875457 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" podUID="395d5fa0-85d1-4121-86c9-730d224a3ec9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 24 22:13:32.876164 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:32.876136 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" Apr 24 22:13:40.432223 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.432192 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb"] Apr 24 22:13:40.432727 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.432528 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" podUID="395d5fa0-85d1-4121-86c9-730d224a3ec9" containerName="kserve-container" containerID="cri-o://cf96baf145a55f3b148ca443f682183212bfb3e89826dfe97c56e27a36a59203" gracePeriod=30 Apr 24 22:13:40.432727 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.432580 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" podUID="395d5fa0-85d1-4121-86c9-730d224a3ec9" containerName="kube-rbac-proxy" containerID="cri-o://715f86d43dd38a8659598b33ec2b9db318bfab6935bd1ae9bee89336334ae570" gracePeriod=30 Apr 24 22:13:40.516196 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.516160 2570 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx"] Apr 24 22:13:40.516499 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.516482 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4edf360b-fc70-424f-8c05-47887616fec0" containerName="storage-initializer" Apr 24 22:13:40.516578 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.516502 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="4edf360b-fc70-424f-8c05-47887616fec0" containerName="storage-initializer" Apr 24 22:13:40.516578 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.516519 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4edf360b-fc70-424f-8c05-47887616fec0" containerName="kube-rbac-proxy" Apr 24 22:13:40.516578 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.516528 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="4edf360b-fc70-424f-8c05-47887616fec0" containerName="kube-rbac-proxy" Apr 24 22:13:40.516578 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.516557 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4edf360b-fc70-424f-8c05-47887616fec0" containerName="kserve-container" Apr 24 22:13:40.516578 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.516566 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="4edf360b-fc70-424f-8c05-47887616fec0" containerName="kserve-container" Apr 24 22:13:40.516834 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.516637 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="4edf360b-fc70-424f-8c05-47887616fec0" containerName="kube-rbac-proxy" Apr 24 22:13:40.516834 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.516650 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="4edf360b-fc70-424f-8c05-47887616fec0" containerName="kserve-container" Apr 24 22:13:40.519766 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.519745 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" Apr 24 22:13:40.521822 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.521799 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-predictor-serving-cert\"" Apr 24 22:13:40.521996 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.521981 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\"" Apr 24 22:13:40.529424 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.529400 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx"] Apr 24 22:13:40.640052 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.640008 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98ce335e-5f51-4252-9f18-cb3599cfc0eb-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx\" (UID: \"98ce335e-5f51-4252-9f18-cb3599cfc0eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" Apr 24 22:13:40.640233 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.640076 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98ce335e-5f51-4252-9f18-cb3599cfc0eb-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx\" (UID: \"98ce335e-5f51-4252-9f18-cb3599cfc0eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" Apr 24 22:13:40.640233 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.640110 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/98ce335e-5f51-4252-9f18-cb3599cfc0eb-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx\" (UID: \"98ce335e-5f51-4252-9f18-cb3599cfc0eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" Apr 24 22:13:40.640233 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.640143 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdt44\" (UniqueName: \"kubernetes.io/projected/98ce335e-5f51-4252-9f18-cb3599cfc0eb-kube-api-access-qdt44\") pod \"isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx\" (UID: \"98ce335e-5f51-4252-9f18-cb3599cfc0eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" Apr 24 22:13:40.741086 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.740942 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98ce335e-5f51-4252-9f18-cb3599cfc0eb-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx\" (UID: \"98ce335e-5f51-4252-9f18-cb3599cfc0eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" Apr 24 22:13:40.741086 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.740983 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/98ce335e-5f51-4252-9f18-cb3599cfc0eb-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx\" (UID: \"98ce335e-5f51-4252-9f18-cb3599cfc0eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" Apr 24 22:13:40.741086 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.741011 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdt44\" (UniqueName: \"kubernetes.io/projected/98ce335e-5f51-4252-9f18-cb3599cfc0eb-kube-api-access-qdt44\") pod \"isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx\" (UID: \"98ce335e-5f51-4252-9f18-cb3599cfc0eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" Apr 24 22:13:40.741086 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.741069 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98ce335e-5f51-4252-9f18-cb3599cfc0eb-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx\" (UID: \"98ce335e-5f51-4252-9f18-cb3599cfc0eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" Apr 24 22:13:40.741445 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.741433 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98ce335e-5f51-4252-9f18-cb3599cfc0eb-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx\" (UID: \"98ce335e-5f51-4252-9f18-cb3599cfc0eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" Apr 24 22:13:40.741814 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.741786 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/98ce335e-5f51-4252-9f18-cb3599cfc0eb-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx\" (UID: \"98ce335e-5f51-4252-9f18-cb3599cfc0eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" Apr 24 22:13:40.743644 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.743625 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98ce335e-5f51-4252-9f18-cb3599cfc0eb-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx\" (UID: \"98ce335e-5f51-4252-9f18-cb3599cfc0eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" Apr 24 22:13:40.749476 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.749452 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdt44\" (UniqueName: \"kubernetes.io/projected/98ce335e-5f51-4252-9f18-cb3599cfc0eb-kube-api-access-qdt44\") pod \"isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx\" (UID: \"98ce335e-5f51-4252-9f18-cb3599cfc0eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" Apr 24 22:13:40.830469 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.830424 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" Apr 24 22:13:40.950887 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:40.950828 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx"] Apr 24 22:13:40.953805 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:13:40.953770 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98ce335e_5f51_4252_9f18_cb3599cfc0eb.slice/crio-8f2290baf08badb9d62ce846c9d0abfe333caaf15ecde866ca20d884f6f64139 WatchSource:0}: Error finding container 8f2290baf08badb9d62ce846c9d0abfe333caaf15ecde866ca20d884f6f64139: Status 404 returned error can't find the container with id 8f2290baf08badb9d62ce846c9d0abfe333caaf15ecde866ca20d884f6f64139 Apr 24 22:13:41.076599 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:41.076555 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" event={"ID":"98ce335e-5f51-4252-9f18-cb3599cfc0eb","Type":"ContainerStarted","Data":"11fb083990fe1d87fb234a12389ab6e8b355fd80ef6799564cb245d0457278b8"} Apr 24 22:13:41.076599 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:41.076595 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" event={"ID":"98ce335e-5f51-4252-9f18-cb3599cfc0eb","Type":"ContainerStarted","Data":"8f2290baf08badb9d62ce846c9d0abfe333caaf15ecde866ca20d884f6f64139"} Apr 24 22:13:41.078533 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:41.078508 2570 generic.go:358] "Generic (PLEG): container finished" podID="395d5fa0-85d1-4121-86c9-730d224a3ec9" containerID="715f86d43dd38a8659598b33ec2b9db318bfab6935bd1ae9bee89336334ae570" exitCode=2 Apr 24 22:13:41.078644 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:41.078563 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" event={"ID":"395d5fa0-85d1-4121-86c9-730d224a3ec9","Type":"ContainerDied","Data":"715f86d43dd38a8659598b33ec2b9db318bfab6935bd1ae9bee89336334ae570"} Apr 24 22:13:42.871187 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:42.871152 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" podUID="395d5fa0-85d1-4121-86c9-730d224a3ec9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.50:8643/healthz\": dial tcp 10.133.0.50:8643: connect: connection refused" Apr 24 22:13:42.875494 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:42.875474 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" podUID="395d5fa0-85d1-4121-86c9-730d224a3ec9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 24 22:13:44.466473 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:44.466447 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" Apr 24 22:13:44.568907 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:44.568876 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/395d5fa0-85d1-4121-86c9-730d224a3ec9-kserve-provision-location\") pod \"395d5fa0-85d1-4121-86c9-730d224a3ec9\" (UID: \"395d5fa0-85d1-4121-86c9-730d224a3ec9\") " Apr 24 22:13:44.568907 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:44.568910 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svpfg\" (UniqueName: \"kubernetes.io/projected/395d5fa0-85d1-4121-86c9-730d224a3ec9-kube-api-access-svpfg\") pod \"395d5fa0-85d1-4121-86c9-730d224a3ec9\" (UID: \"395d5fa0-85d1-4121-86c9-730d224a3ec9\") " Apr 24 22:13:44.569106 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:44.568967 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/395d5fa0-85d1-4121-86c9-730d224a3ec9-proxy-tls\") pod \"395d5fa0-85d1-4121-86c9-730d224a3ec9\" (UID: \"395d5fa0-85d1-4121-86c9-730d224a3ec9\") " Apr 24 22:13:44.569106 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:44.568992 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/395d5fa0-85d1-4121-86c9-730d224a3ec9-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"395d5fa0-85d1-4121-86c9-730d224a3ec9\" (UID: \"395d5fa0-85d1-4121-86c9-730d224a3ec9\") " Apr 24 22:13:44.569218 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:44.569201 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/395d5fa0-85d1-4121-86c9-730d224a3ec9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "395d5fa0-85d1-4121-86c9-730d224a3ec9" (UID: "395d5fa0-85d1-4121-86c9-730d224a3ec9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:13:44.569460 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:44.569437 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/395d5fa0-85d1-4121-86c9-730d224a3ec9-isvc-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-kube-rbac-proxy-sar-config") pod "395d5fa0-85d1-4121-86c9-730d224a3ec9" (UID: "395d5fa0-85d1-4121-86c9-730d224a3ec9"). InnerVolumeSpecName "isvc-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:13:44.571127 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:44.571102 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/395d5fa0-85d1-4121-86c9-730d224a3ec9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "395d5fa0-85d1-4121-86c9-730d224a3ec9" (UID: "395d5fa0-85d1-4121-86c9-730d224a3ec9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:13:44.571213 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:44.571139 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/395d5fa0-85d1-4121-86c9-730d224a3ec9-kube-api-access-svpfg" (OuterVolumeSpecName: "kube-api-access-svpfg") pod "395d5fa0-85d1-4121-86c9-730d224a3ec9" (UID: "395d5fa0-85d1-4121-86c9-730d224a3ec9"). 
InnerVolumeSpecName "kube-api-access-svpfg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:13:44.669434 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:44.669416 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/395d5fa0-85d1-4121-86c9-730d224a3ec9-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:13:44.669434 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:44.669434 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/395d5fa0-85d1-4121-86c9-730d224a3ec9-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:13:44.669560 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:44.669444 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/395d5fa0-85d1-4121-86c9-730d224a3ec9-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:13:44.669560 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:44.669454 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-svpfg\" (UniqueName: \"kubernetes.io/projected/395d5fa0-85d1-4121-86c9-730d224a3ec9-kube-api-access-svpfg\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:13:45.095143 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:45.095111 2570 generic.go:358] "Generic (PLEG): container finished" podID="395d5fa0-85d1-4121-86c9-730d224a3ec9" containerID="cf96baf145a55f3b148ca443f682183212bfb3e89826dfe97c56e27a36a59203" exitCode=0 Apr 24 22:13:45.095283 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:45.095192 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" Apr 24 22:13:45.095283 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:45.095194 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" event={"ID":"395d5fa0-85d1-4121-86c9-730d224a3ec9","Type":"ContainerDied","Data":"cf96baf145a55f3b148ca443f682183212bfb3e89826dfe97c56e27a36a59203"} Apr 24 22:13:45.095283 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:45.095229 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb" event={"ID":"395d5fa0-85d1-4121-86c9-730d224a3ec9","Type":"ContainerDied","Data":"95601067ecafe20006ca6a923f56e8d30e652b05a3997f9683c3ab4cbb13fbbe"} Apr 24 22:13:45.095283 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:45.095249 2570 scope.go:117] "RemoveContainer" containerID="715f86d43dd38a8659598b33ec2b9db318bfab6935bd1ae9bee89336334ae570" Apr 24 22:13:45.096622 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:45.096605 2570 generic.go:358] "Generic (PLEG): container finished" podID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" containerID="11fb083990fe1d87fb234a12389ab6e8b355fd80ef6799564cb245d0457278b8" exitCode=0 Apr 24 22:13:45.096766 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:45.096690 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" event={"ID":"98ce335e-5f51-4252-9f18-cb3599cfc0eb","Type":"ContainerDied","Data":"11fb083990fe1d87fb234a12389ab6e8b355fd80ef6799564cb245d0457278b8"} Apr 24 22:13:45.103804 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:45.103716 2570 scope.go:117] "RemoveContainer" containerID="cf96baf145a55f3b148ca443f682183212bfb3e89826dfe97c56e27a36a59203" Apr 24 22:13:45.112288 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:45.112272 2570 scope.go:117] "RemoveContainer" containerID="6dd4b7778bae3af32f7cf1905a15d3f80abcb5672f18f0d21f3a553189591429" Apr 24 22:13:45.120580 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:45.120564 2570 scope.go:117] "RemoveContainer" containerID="715f86d43dd38a8659598b33ec2b9db318bfab6935bd1ae9bee89336334ae570" Apr 24 22:13:45.120835 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:13:45.120817 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"715f86d43dd38a8659598b33ec2b9db318bfab6935bd1ae9bee89336334ae570\": container with ID starting with 715f86d43dd38a8659598b33ec2b9db318bfab6935bd1ae9bee89336334ae570 not found: ID does not exist" containerID="715f86d43dd38a8659598b33ec2b9db318bfab6935bd1ae9bee89336334ae570" Apr 24 22:13:45.120906 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:45.120845 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"715f86d43dd38a8659598b33ec2b9db318bfab6935bd1ae9bee89336334ae570"} err="failed to get container status \"715f86d43dd38a8659598b33ec2b9db318bfab6935bd1ae9bee89336334ae570\": rpc error: code = NotFound desc = could not find container \"715f86d43dd38a8659598b33ec2b9db318bfab6935bd1ae9bee89336334ae570\": container with ID starting with 715f86d43dd38a8659598b33ec2b9db318bfab6935bd1ae9bee89336334ae570 not found: ID does not exist" Apr 24 22:13:45.120906 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:45.120866 2570 scope.go:117] "RemoveContainer" containerID="cf96baf145a55f3b148ca443f682183212bfb3e89826dfe97c56e27a36a59203" Apr 24 22:13:45.121132 ip-10-0-129-230 
kubenswrapper[2570]: E0424 22:13:45.121113 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf96baf145a55f3b148ca443f682183212bfb3e89826dfe97c56e27a36a59203\": container with ID starting with cf96baf145a55f3b148ca443f682183212bfb3e89826dfe97c56e27a36a59203 not found: ID does not exist" containerID="cf96baf145a55f3b148ca443f682183212bfb3e89826dfe97c56e27a36a59203" Apr 24 22:13:45.121177 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:45.121139 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf96baf145a55f3b148ca443f682183212bfb3e89826dfe97c56e27a36a59203"} err="failed to get container status \"cf96baf145a55f3b148ca443f682183212bfb3e89826dfe97c56e27a36a59203\": rpc error: code = NotFound desc = could not find container \"cf96baf145a55f3b148ca443f682183212bfb3e89826dfe97c56e27a36a59203\": container with ID starting with cf96baf145a55f3b148ca443f682183212bfb3e89826dfe97c56e27a36a59203 not found: ID does not exist" Apr 24 22:13:45.121177 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:45.121154 2570 scope.go:117] "RemoveContainer" containerID="6dd4b7778bae3af32f7cf1905a15d3f80abcb5672f18f0d21f3a553189591429" Apr 24 22:13:45.121419 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:13:45.121403 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dd4b7778bae3af32f7cf1905a15d3f80abcb5672f18f0d21f3a553189591429\": container with ID starting with 6dd4b7778bae3af32f7cf1905a15d3f80abcb5672f18f0d21f3a553189591429 not found: ID does not exist" containerID="6dd4b7778bae3af32f7cf1905a15d3f80abcb5672f18f0d21f3a553189591429" Apr 24 22:13:45.121475 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:45.121427 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dd4b7778bae3af32f7cf1905a15d3f80abcb5672f18f0d21f3a553189591429"} err="failed to get container status \"6dd4b7778bae3af32f7cf1905a15d3f80abcb5672f18f0d21f3a553189591429\": rpc error: code = NotFound desc = could not find container \"6dd4b7778bae3af32f7cf1905a15d3f80abcb5672f18f0d21f3a553189591429\": container with ID starting with 6dd4b7778bae3af32f7cf1905a15d3f80abcb5672f18f0d21f3a553189591429 not found: ID does not exist" Apr 24 22:13:45.128830 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:45.128780 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb"] Apr 24 22:13:45.132738 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:45.132710 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-868cb76d-vmbnb"] Apr 24 22:13:46.066950 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:46.066922 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="395d5fa0-85d1-4121-86c9-730d224a3ec9" path="/var/lib/kubelet/pods/395d5fa0-85d1-4121-86c9-730d224a3ec9/volumes" Apr 24 22:13:46.101163 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:46.101133 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" event={"ID":"98ce335e-5f51-4252-9f18-cb3599cfc0eb","Type":"ContainerStarted","Data":"cb5fdab91615daaa7751b049e93f6814f1c34599cad9846d554f26f262fb358b"} Apr 24 22:13:46.101319 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:46.101170 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" event={"ID":"98ce335e-5f51-4252-9f18-cb3599cfc0eb","Type":"ContainerStarted","Data":"ee5974112177cd79f136959972fee7592e6f8f2e6e15b81e80debd4a7d9606cd"} Apr 24 22:13:46.101408 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:46.101390 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" Apr 24 22:13:46.122236 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:46.122195 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" podStartSLOduration=6.12218243 podStartE2EDuration="6.12218243s" podCreationTimestamp="2026-04-24 22:13:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:13:46.120873555 +0000 UTC m=+2778.544380960" watchObservedRunningTime="2026-04-24 22:13:46.12218243 +0000 UTC m=+2778.545689837" Apr 24 22:13:47.104780 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:47.104747 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" Apr 24 22:13:47.105973 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:47.105940 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" podUID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.51:8080: connect: connection refused" Apr 24 22:13:48.106629 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:48.106592 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" podUID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.51:8080: connect: connection refused" Apr 24 22:13:53.111237 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:53.111208 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" Apr 24 22:13:53.111724 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:13:53.111695 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" podUID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.51:8080: connect: connection refused" Apr 24 22:14:03.112087 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:14:03.112011 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" podUID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.51:8080: connect: connection refused" Apr 24 22:14:13.111614 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:14:13.111573 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" podUID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.51:8080: connect: connection refused" Apr 24 22:14:23.112148 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:14:23.112100 2570 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" podUID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.51:8080: connect: connection refused" Apr 24 22:14:33.112285 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:14:33.112248 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" podUID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.51:8080: connect: connection refused" Apr 24 22:14:43.111774 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:14:43.111736 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" podUID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.51:8080: connect: connection refused" Apr 24 22:14:53.112174 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:14:53.112140 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" Apr 24 22:15:00.637135 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.637083 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx"] Apr 24 22:15:00.637629 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.637494 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" podUID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" containerName="kserve-container" containerID="cri-o://ee5974112177cd79f136959972fee7592e6f8f2e6e15b81e80debd4a7d9606cd" gracePeriod=30 Apr 24 22:15:00.637700 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.637569 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" podUID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" containerName="kube-rbac-proxy" containerID="cri-o://cb5fdab91615daaa7751b049e93f6814f1c34599cad9846d554f26f262fb358b" gracePeriod=30 Apr 24 22:15:00.722552 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.722516 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j"] Apr 24 22:15:00.722815 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.722803 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="395d5fa0-85d1-4121-86c9-730d224a3ec9" containerName="storage-initializer" Apr 24 22:15:00.722859 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.722817 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="395d5fa0-85d1-4121-86c9-730d224a3ec9" containerName="storage-initializer" Apr 24 22:15:00.722859 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.722833 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="395d5fa0-85d1-4121-86c9-730d224a3ec9" containerName="kube-rbac-proxy" Apr 24 22:15:00.722859 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.722838 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="395d5fa0-85d1-4121-86c9-730d224a3ec9" containerName="kube-rbac-proxy" Apr 24 22:15:00.722859 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.722847 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="395d5fa0-85d1-4121-86c9-730d224a3ec9" containerName="kserve-container" Apr 24 22:15:00.722859 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.722852 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="395d5fa0-85d1-4121-86c9-730d224a3ec9" containerName="kserve-container" Apr 24 22:15:00.723013 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.722896 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="395d5fa0-85d1-4121-86c9-730d224a3ec9" containerName="kube-rbac-proxy" Apr 24 22:15:00.723013 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.722907 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="395d5fa0-85d1-4121-86c9-730d224a3ec9" containerName="kserve-container" Apr 24 22:15:00.725722 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.725706 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" Apr 24 22:15:00.728007 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.727990 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-predictor-serving-cert\"" Apr 24 22:15:00.728113 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.728009 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-kube-rbac-proxy-sar-config\"" Apr 24 22:15:00.742601 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.742578 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j"] Apr 24 22:15:00.816984 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.816952 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxg8h\" (UniqueName: \"kubernetes.io/projected/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-kube-api-access-gxg8h\") pod \"isvc-tensorflow-predictor-6756f669d7-x588j\" (UID: \"9f085049-2b98-4b5a-8813-b25cfcfd6bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" Apr 24 22:15:00.817162 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.816991 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-x588j\" (UID: \"9f085049-2b98-4b5a-8813-b25cfcfd6bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" Apr 24 22:15:00.817162 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.817087 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-x588j\" (UID: \"9f085049-2b98-4b5a-8813-b25cfcfd6bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" Apr 24 22:15:00.817162 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.817103 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-x588j\" (UID: \"9f085049-2b98-4b5a-8813-b25cfcfd6bf6\") " 
pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" Apr 24 22:15:00.918308 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.918280 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-x588j\" (UID: \"9f085049-2b98-4b5a-8813-b25cfcfd6bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" Apr 24 22:15:00.918453 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.918315 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-x588j\" (UID: \"9f085049-2b98-4b5a-8813-b25cfcfd6bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" Apr 24 22:15:00.918453 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.918353 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxg8h\" (UniqueName: \"kubernetes.io/projected/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-kube-api-access-gxg8h\") pod \"isvc-tensorflow-predictor-6756f669d7-x588j\" (UID: \"9f085049-2b98-4b5a-8813-b25cfcfd6bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" Apr 24 22:15:00.918453 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.918401 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-x588j\" (UID: \"9f085049-2b98-4b5a-8813-b25cfcfd6bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" Apr 24 22:15:00.918585 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:15:00.918534 2570 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-tensorflow-predictor-serving-cert: secret "isvc-tensorflow-predictor-serving-cert" not found Apr 24 22:15:00.918642 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:15:00.918611 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-proxy-tls podName:9f085049-2b98-4b5a-8813-b25cfcfd6bf6 nodeName:}" failed. No retries permitted until 2026-04-24 22:15:01.418588896 +0000 UTC m=+2853.842096296 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-proxy-tls") pod "isvc-tensorflow-predictor-6756f669d7-x588j" (UID: "9f085049-2b98-4b5a-8813-b25cfcfd6bf6") : secret "isvc-tensorflow-predictor-serving-cert" not found Apr 24 22:15:00.918809 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.918789 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-x588j\" (UID: \"9f085049-2b98-4b5a-8813-b25cfcfd6bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" Apr 24 22:15:00.918983 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.918967 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-x588j\" (UID: \"9f085049-2b98-4b5a-8813-b25cfcfd6bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" Apr 24 22:15:00.929867 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:00.929836 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxg8h\" (UniqueName: \"kubernetes.io/projected/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-kube-api-access-gxg8h\") pod \"isvc-tensorflow-predictor-6756f669d7-x588j\" (UID: \"9f085049-2b98-4b5a-8813-b25cfcfd6bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" Apr 24 22:15:01.302887 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:01.302802 2570 generic.go:358] "Generic (PLEG): container finished" podID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" containerID="cb5fdab91615daaa7751b049e93f6814f1c34599cad9846d554f26f262fb358b" exitCode=2 Apr 24 22:15:01.302887 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:01.302854 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" event={"ID":"98ce335e-5f51-4252-9f18-cb3599cfc0eb","Type":"ContainerDied","Data":"cb5fdab91615daaa7751b049e93f6814f1c34599cad9846d554f26f262fb358b"} Apr 24 22:15:01.423651 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:01.423604 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-x588j\" (UID: \"9f085049-2b98-4b5a-8813-b25cfcfd6bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" Apr 24 22:15:01.426135 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:01.426111 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-x588j\" (UID: \"9f085049-2b98-4b5a-8813-b25cfcfd6bf6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" Apr 24 22:15:01.635837 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:01.635740 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" Apr 24 22:15:01.763141 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:01.763102 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j"] Apr 24 22:15:01.766224 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:15:01.766181 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f085049_2b98_4b5a_8813_b25cfcfd6bf6.slice/crio-da7469f348228766cb7e0f1683cd3db35fcc8a4dd7bb415f4c382208a2b0b115 WatchSource:0}: Error finding container da7469f348228766cb7e0f1683cd3db35fcc8a4dd7bb415f4c382208a2b0b115: Status 404 returned error can't find the container with id da7469f348228766cb7e0f1683cd3db35fcc8a4dd7bb415f4c382208a2b0b115 Apr 24 22:15:01.767980 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:01.767963 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:15:02.307420 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:02.307383 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" event={"ID":"9f085049-2b98-4b5a-8813-b25cfcfd6bf6","Type":"ContainerStarted","Data":"c40a97e4630fbb3785aa477ef4d760a4cb91f180e4b9bdff7df0a4d6e582af09"} Apr 24 22:15:02.307420 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:02.307418 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" event={"ID":"9f085049-2b98-4b5a-8813-b25cfcfd6bf6","Type":"ContainerStarted","Data":"da7469f348228766cb7e0f1683cd3db35fcc8a4dd7bb415f4c382208a2b0b115"} Apr 24 22:15:03.107420 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:03.107377 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" podUID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.51:8643/healthz\": dial tcp 10.133.0.51:8643: connect: connection refused" Apr 24 22:15:03.111734 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:03.111709 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" podUID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.51:8080: connect: connection refused" Apr 24 22:15:04.870220 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:04.870196 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" Apr 24 22:15:04.949542 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:04.949518 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/98ce335e-5f51-4252-9f18-cb3599cfc0eb-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"98ce335e-5f51-4252-9f18-cb3599cfc0eb\" (UID: \"98ce335e-5f51-4252-9f18-cb3599cfc0eb\") " Apr 24 22:15:04.949663 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:04.949550 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdt44\" (UniqueName: \"kubernetes.io/projected/98ce335e-5f51-4252-9f18-cb3599cfc0eb-kube-api-access-qdt44\") pod \"98ce335e-5f51-4252-9f18-cb3599cfc0eb\" (UID: \"98ce335e-5f51-4252-9f18-cb3599cfc0eb\") " Apr 24 22:15:04.949663 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:04.949575 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98ce335e-5f51-4252-9f18-cb3599cfc0eb-kserve-provision-location\") pod \"98ce335e-5f51-4252-9f18-cb3599cfc0eb\" (UID: \"98ce335e-5f51-4252-9f18-cb3599cfc0eb\") " Apr 24 22:15:04.949741 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:04.949693 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98ce335e-5f51-4252-9f18-cb3599cfc0eb-proxy-tls\") pod \"98ce335e-5f51-4252-9f18-cb3599cfc0eb\" (UID: \"98ce335e-5f51-4252-9f18-cb3599cfc0eb\") " Apr 24 22:15:04.949897 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:04.949867 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98ce335e-5f51-4252-9f18-cb3599cfc0eb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "98ce335e-5f51-4252-9f18-cb3599cfc0eb" (UID: "98ce335e-5f51-4252-9f18-cb3599cfc0eb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:15:04.950015 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:04.949896 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98ce335e-5f51-4252-9f18-cb3599cfc0eb-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config") pod "98ce335e-5f51-4252-9f18-cb3599cfc0eb" (UID: "98ce335e-5f51-4252-9f18-cb3599cfc0eb"). InnerVolumeSpecName "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:15:04.951769 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:04.951748 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98ce335e-5f51-4252-9f18-cb3599cfc0eb-kube-api-access-qdt44" (OuterVolumeSpecName: "kube-api-access-qdt44") pod "98ce335e-5f51-4252-9f18-cb3599cfc0eb" (UID: "98ce335e-5f51-4252-9f18-cb3599cfc0eb"). InnerVolumeSpecName "kube-api-access-qdt44". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:15:04.951851 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:04.951809 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98ce335e-5f51-4252-9f18-cb3599cfc0eb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "98ce335e-5f51-4252-9f18-cb3599cfc0eb" (UID: "98ce335e-5f51-4252-9f18-cb3599cfc0eb"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:15:05.050361 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:05.050286 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98ce335e-5f51-4252-9f18-cb3599cfc0eb-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:15:05.050361 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:05.050321 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/98ce335e-5f51-4252-9f18-cb3599cfc0eb-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:15:05.050361 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:05.050330 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/98ce335e-5f51-4252-9f18-cb3599cfc0eb-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:15:05.050361 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:05.050342 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qdt44\" (UniqueName: \"kubernetes.io/projected/98ce335e-5f51-4252-9f18-cb3599cfc0eb-kube-api-access-qdt44\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:15:05.320863 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:05.320771 2570 generic.go:358] "Generic (PLEG): container finished" podID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" containerID="ee5974112177cd79f136959972fee7592e6f8f2e6e15b81e80debd4a7d9606cd" exitCode=0 Apr 24 22:15:05.321002 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:05.320858 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" event={"ID":"98ce335e-5f51-4252-9f18-cb3599cfc0eb","Type":"ContainerDied","Data":"ee5974112177cd79f136959972fee7592e6f8f2e6e15b81e80debd4a7d9606cd"} Apr 24 22:15:05.321002 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:05.320883 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" Apr 24 22:15:05.321002 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:05.320894 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx" event={"ID":"98ce335e-5f51-4252-9f18-cb3599cfc0eb","Type":"ContainerDied","Data":"8f2290baf08badb9d62ce846c9d0abfe333caaf15ecde866ca20d884f6f64139"} Apr 24 22:15:05.321002 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:05.320914 2570 scope.go:117] "RemoveContainer" containerID="cb5fdab91615daaa7751b049e93f6814f1c34599cad9846d554f26f262fb358b" Apr 24 22:15:05.328723 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:05.328700 2570 scope.go:117] "RemoveContainer" containerID="ee5974112177cd79f136959972fee7592e6f8f2e6e15b81e80debd4a7d9606cd" Apr 24 22:15:05.335537 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:05.335522 2570 scope.go:117] "RemoveContainer" containerID="11fb083990fe1d87fb234a12389ab6e8b355fd80ef6799564cb245d0457278b8" Apr 24 22:15:05.343174 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:05.343135 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx"] Apr 24 22:15:05.345449 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:05.345432 2570 scope.go:117] "RemoveContainer" containerID="cb5fdab91615daaa7751b049e93f6814f1c34599cad9846d554f26f262fb358b" Apr 24 22:15:05.346101 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:15:05.346064 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb5fdab91615daaa7751b049e93f6814f1c34599cad9846d554f26f262fb358b\": container with ID starting with cb5fdab91615daaa7751b049e93f6814f1c34599cad9846d554f26f262fb358b not found: ID does not exist" containerID="cb5fdab91615daaa7751b049e93f6814f1c34599cad9846d554f26f262fb358b" Apr 24 22:15:05.346216 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:05.346129 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb5fdab91615daaa7751b049e93f6814f1c34599cad9846d554f26f262fb358b"} err="failed to get container status \"cb5fdab91615daaa7751b049e93f6814f1c34599cad9846d554f26f262fb358b\": rpc error: code = NotFound desc = could not find container \"cb5fdab91615daaa7751b049e93f6814f1c34599cad9846d554f26f262fb358b\": container with ID starting with cb5fdab91615daaa7751b049e93f6814f1c34599cad9846d554f26f262fb358b not found: ID does not exist" Apr 24 22:15:05.346216 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:05.346157 2570 scope.go:117] "RemoveContainer" containerID="ee5974112177cd79f136959972fee7592e6f8f2e6e15b81e80debd4a7d9606cd" Apr 24 22:15:05.346635 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:15:05.346616 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee5974112177cd79f136959972fee7592e6f8f2e6e15b81e80debd4a7d9606cd\": container with ID starting with ee5974112177cd79f136959972fee7592e6f8f2e6e15b81e80debd4a7d9606cd not found: ID does not exist" containerID="ee5974112177cd79f136959972fee7592e6f8f2e6e15b81e80debd4a7d9606cd" Apr 24 22:15:05.346721 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:05.346640 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee5974112177cd79f136959972fee7592e6f8f2e6e15b81e80debd4a7d9606cd"} err="failed to get container status 
\"ee5974112177cd79f136959972fee7592e6f8f2e6e15b81e80debd4a7d9606cd\": rpc error: code = NotFound desc = could not find container \"ee5974112177cd79f136959972fee7592e6f8f2e6e15b81e80debd4a7d9606cd\": container with ID starting with ee5974112177cd79f136959972fee7592e6f8f2e6e15b81e80debd4a7d9606cd not found: ID does not exist" Apr 24 22:15:05.346721 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:05.346656 2570 scope.go:117] "RemoveContainer" containerID="11fb083990fe1d87fb234a12389ab6e8b355fd80ef6799564cb245d0457278b8" Apr 24 22:15:05.346966 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:15:05.346946 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11fb083990fe1d87fb234a12389ab6e8b355fd80ef6799564cb245d0457278b8\": container with ID starting with 11fb083990fe1d87fb234a12389ab6e8b355fd80ef6799564cb245d0457278b8 not found: ID does not exist" containerID="11fb083990fe1d87fb234a12389ab6e8b355fd80ef6799564cb245d0457278b8" Apr 24 22:15:05.347057 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:05.346975 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11fb083990fe1d87fb234a12389ab6e8b355fd80ef6799564cb245d0457278b8"} err="failed to get container status \"11fb083990fe1d87fb234a12389ab6e8b355fd80ef6799564cb245d0457278b8\": rpc error: code = NotFound desc = could not find container \"11fb083990fe1d87fb234a12389ab6e8b355fd80ef6799564cb245d0457278b8\": container with ID starting with 11fb083990fe1d87fb234a12389ab6e8b355fd80ef6799564cb245d0457278b8 not found: ID does not exist" Apr 24 22:15:05.347135 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:05.347115 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8664c8b4d9-jwvhx"] Apr 24 22:15:06.067543 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:06.067511 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" path="/var/lib/kubelet/pods/98ce335e-5f51-4252-9f18-cb3599cfc0eb/volumes" Apr 24 22:15:07.329146 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:07.329115 2570 generic.go:358] "Generic (PLEG): container finished" podID="9f085049-2b98-4b5a-8813-b25cfcfd6bf6" containerID="c40a97e4630fbb3785aa477ef4d760a4cb91f180e4b9bdff7df0a4d6e582af09" exitCode=0 Apr 24 22:15:07.329528 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:07.329185 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" event={"ID":"9f085049-2b98-4b5a-8813-b25cfcfd6bf6","Type":"ContainerDied","Data":"c40a97e4630fbb3785aa477ef4d760a4cb91f180e4b9bdff7df0a4d6e582af09"} Apr 24 22:15:11.343102 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:11.343066 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" event={"ID":"9f085049-2b98-4b5a-8813-b25cfcfd6bf6","Type":"ContainerStarted","Data":"85dcc7e3f11606c1a8e779757737b4850acb6d368a2f6881c99b8eb87818aa11"} Apr 24 22:15:11.343102 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:11.343107 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" event={"ID":"9f085049-2b98-4b5a-8813-b25cfcfd6bf6","Type":"ContainerStarted","Data":"c39e950423a4406b60f6a99d04b9d666e348dc67ebdaca97efeea513ee994625"} Apr 24 22:15:11.343531 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:11.343381 2570 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" Apr 24 22:15:11.343531 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:11.343502 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" Apr 24 22:15:11.344720 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:11.344696 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" podUID="9f085049-2b98-4b5a-8813-b25cfcfd6bf6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 24 22:15:11.363652 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:11.363605 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" podStartSLOduration=7.862179717 podStartE2EDuration="11.363594053s" podCreationTimestamp="2026-04-24 22:15:00 +0000 UTC" firstStartedPulling="2026-04-24 22:15:07.330359776 +0000 UTC m=+2859.753867161" lastFinishedPulling="2026-04-24 22:15:10.831774095 +0000 UTC m=+2863.255281497" observedRunningTime="2026-04-24 22:15:11.362445677 +0000 UTC m=+2863.785953083" watchObservedRunningTime="2026-04-24 22:15:11.363594053 +0000 UTC m=+2863.787101461" Apr 24 22:15:12.346161 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:12.346126 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" podUID="9f085049-2b98-4b5a-8813-b25cfcfd6bf6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 24 22:15:17.351335 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:17.351303 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" Apr 24 22:15:17.351951 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:17.351926 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" podUID="9f085049-2b98-4b5a-8813-b25cfcfd6bf6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 24 22:15:27.352824 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:27.352795 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" Apr 24 22:15:41.381901 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.381870 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j"] Apr 24 22:15:41.382348 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.382268 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" podUID="9f085049-2b98-4b5a-8813-b25cfcfd6bf6" containerName="kserve-container" containerID="cri-o://c39e950423a4406b60f6a99d04b9d666e348dc67ebdaca97efeea513ee994625" gracePeriod=30 Apr 24 22:15:41.382421 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.382339 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" podUID="9f085049-2b98-4b5a-8813-b25cfcfd6bf6" containerName="kube-rbac-proxy" 
containerID="cri-o://85dcc7e3f11606c1a8e779757737b4850acb6d368a2f6881c99b8eb87818aa11" gracePeriod=30 Apr 24 22:15:41.447653 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.447604 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m"] Apr 24 22:15:41.447923 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.447911 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" containerName="kube-rbac-proxy" Apr 24 22:15:41.447966 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.447925 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" containerName="kube-rbac-proxy" Apr 24 22:15:41.447966 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.447935 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" containerName="kserve-container" Apr 24 22:15:41.447966 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.447941 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" containerName="kserve-container" Apr 24 22:15:41.447966 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.447957 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" containerName="storage-initializer" Apr 24 22:15:41.447966 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.447963 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" containerName="storage-initializer" Apr 24 22:15:41.448157 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.448006 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" containerName="kube-rbac-proxy" Apr 24 22:15:41.448157 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.448014 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="98ce335e-5f51-4252-9f18-cb3599cfc0eb" containerName="kserve-container" Apr 24 22:15:41.451074 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.451058 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" Apr 24 22:15:41.453459 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.453440 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-predictor-serving-cert\"" Apr 24 22:15:41.453627 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.453614 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\"" Apr 24 22:15:41.461989 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.461970 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m"] Apr 24 22:15:41.518295 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.518269 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4ptz\" (UniqueName: \"kubernetes.io/projected/4e7bb452-7407-4043-bf44-0bd6c1e61bed-kube-api-access-q4ptz\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m\" (UID: \"4e7bb452-7407-4043-bf44-0bd6c1e61bed\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" Apr 24 22:15:41.518384 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.518307 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4e7bb452-7407-4043-bf44-0bd6c1e61bed-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m\" (UID: \"4e7bb452-7407-4043-bf44-0bd6c1e61bed\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" Apr 24 22:15:41.518384 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.518354 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e7bb452-7407-4043-bf44-0bd6c1e61bed-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m\" (UID: \"4e7bb452-7407-4043-bf44-0bd6c1e61bed\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" Apr 24 22:15:41.518474 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.518434 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4e7bb452-7407-4043-bf44-0bd6c1e61bed-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m\" (UID: \"4e7bb452-7407-4043-bf44-0bd6c1e61bed\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" Apr 24 22:15:41.619095 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.619060 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e7bb452-7407-4043-bf44-0bd6c1e61bed-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m\" (UID: \"4e7bb452-7407-4043-bf44-0bd6c1e61bed\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" Apr 24 22:15:41.619254 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.619117 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4e7bb452-7407-4043-bf44-0bd6c1e61bed-proxy-tls\") pod 
\"isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m\" (UID: \"4e7bb452-7407-4043-bf44-0bd6c1e61bed\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" Apr 24 22:15:41.619254 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.619149 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4ptz\" (UniqueName: \"kubernetes.io/projected/4e7bb452-7407-4043-bf44-0bd6c1e61bed-kube-api-access-q4ptz\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m\" (UID: \"4e7bb452-7407-4043-bf44-0bd6c1e61bed\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" Apr 24 22:15:41.619254 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.619172 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4e7bb452-7407-4043-bf44-0bd6c1e61bed-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m\" (UID: \"4e7bb452-7407-4043-bf44-0bd6c1e61bed\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" Apr 24 22:15:41.619506 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.619462 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e7bb452-7407-4043-bf44-0bd6c1e61bed-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m\" (UID: \"4e7bb452-7407-4043-bf44-0bd6c1e61bed\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" Apr 24 22:15:41.619829 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.619811 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4e7bb452-7407-4043-bf44-0bd6c1e61bed-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m\" (UID: \"4e7bb452-7407-4043-bf44-0bd6c1e61bed\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" Apr 24 22:15:41.621620 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.621602 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4e7bb452-7407-4043-bf44-0bd6c1e61bed-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m\" (UID: \"4e7bb452-7407-4043-bf44-0bd6c1e61bed\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" Apr 24 22:15:41.627905 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.627886 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4ptz\" (UniqueName: \"kubernetes.io/projected/4e7bb452-7407-4043-bf44-0bd6c1e61bed-kube-api-access-q4ptz\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m\" (UID: \"4e7bb452-7407-4043-bf44-0bd6c1e61bed\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" Apr 24 22:15:41.760926 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.760894 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" Apr 24 22:15:41.898782 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:41.898751 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m"] Apr 24 22:15:41.901608 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:15:41.901580 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e7bb452_7407_4043_bf44_0bd6c1e61bed.slice/crio-876ef00bdbf1b38208690fc63e940b3b8ddc4357bab232f045f53798458a783c WatchSource:0}: Error finding container 876ef00bdbf1b38208690fc63e940b3b8ddc4357bab232f045f53798458a783c: Status 404 returned error can't find the container with id 876ef00bdbf1b38208690fc63e940b3b8ddc4357bab232f045f53798458a783c Apr 24 22:15:42.347242 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:42.347196 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" podUID="9f085049-2b98-4b5a-8813-b25cfcfd6bf6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.52:8643/healthz\": dial tcp 10.133.0.52:8643: connect: connection refused" Apr 24 22:15:42.438101 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:42.438068 2570 generic.go:358] "Generic (PLEG): container finished" podID="9f085049-2b98-4b5a-8813-b25cfcfd6bf6" containerID="85dcc7e3f11606c1a8e779757737b4850acb6d368a2f6881c99b8eb87818aa11" exitCode=2 Apr 24 22:15:42.438556 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:42.438141 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" event={"ID":"9f085049-2b98-4b5a-8813-b25cfcfd6bf6","Type":"ContainerDied","Data":"85dcc7e3f11606c1a8e779757737b4850acb6d368a2f6881c99b8eb87818aa11"} Apr 24 22:15:42.439429 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:42.439408 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" event={"ID":"4e7bb452-7407-4043-bf44-0bd6c1e61bed","Type":"ContainerStarted","Data":"017272c3334f16ccc022bd3c4eeb85400930e492126a28a87684879735fc99c9"} Apr 24 22:15:42.439429 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:42.439433 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" event={"ID":"4e7bb452-7407-4043-bf44-0bd6c1e61bed","Type":"ContainerStarted","Data":"876ef00bdbf1b38208690fc63e940b3b8ddc4357bab232f045f53798458a783c"} Apr 24 22:15:46.450782 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:46.450699 2570 generic.go:358] "Generic (PLEG): container finished" podID="4e7bb452-7407-4043-bf44-0bd6c1e61bed" containerID="017272c3334f16ccc022bd3c4eeb85400930e492126a28a87684879735fc99c9" exitCode=0 Apr 24 22:15:46.451163 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:46.450773 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" event={"ID":"4e7bb452-7407-4043-bf44-0bd6c1e61bed","Type":"ContainerDied","Data":"017272c3334f16ccc022bd3c4eeb85400930e492126a28a87684879735fc99c9"} Apr 24 22:15:47.346895 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:47.346852 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" podUID="9f085049-2b98-4b5a-8813-b25cfcfd6bf6" 
containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.52:8643/healthz\": dial tcp 10.133.0.52:8643: connect: connection refused" Apr 24 22:15:47.455546 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:47.455508 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" event={"ID":"4e7bb452-7407-4043-bf44-0bd6c1e61bed","Type":"ContainerStarted","Data":"82acaa1fbce85165f5a82ae2cef6e50be69929cb781522d3447177af4347b062"} Apr 24 22:15:47.455546 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:47.455545 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" event={"ID":"4e7bb452-7407-4043-bf44-0bd6c1e61bed","Type":"ContainerStarted","Data":"64000a1749522b8fc08009149a5633f2fb69165b2dbaba9077bf53a9d4e0aabf"} Apr 24 22:15:47.456107 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:47.455772 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" Apr 24 22:15:47.474714 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:47.474663 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" podStartSLOduration=6.474647774 podStartE2EDuration="6.474647774s" podCreationTimestamp="2026-04-24 22:15:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:15:47.47336174 +0000 UTC m=+2899.896869145" watchObservedRunningTime="2026-04-24 22:15:47.474647774 +0000 UTC m=+2899.898155250" Apr 24 22:15:48.458091 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:48.458063 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" Apr 24 22:15:48.459124 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:48.459098 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" podUID="4e7bb452-7407-4043-bf44-0bd6c1e61bed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 24 22:15:49.461251 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:49.461208 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" podUID="4e7bb452-7407-4043-bf44-0bd6c1e61bed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 24 22:15:52.346729 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:52.346685 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" podUID="9f085049-2b98-4b5a-8813-b25cfcfd6bf6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.52:8643/healthz\": dial tcp 10.133.0.52:8643: connect: connection refused" Apr 24 22:15:52.347149 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:52.346807 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" Apr 24 22:15:54.464887 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:54.464860 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" Apr 24 22:15:54.465489 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:54.465463 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" podUID="4e7bb452-7407-4043-bf44-0bd6c1e61bed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 24 22:15:57.346838 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:15:57.346792 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" podUID="9f085049-2b98-4b5a-8813-b25cfcfd6bf6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.52:8643/healthz\": dial tcp 10.133.0.52:8643: connect: connection refused" Apr 24 22:16:02.346724 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:02.346681 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" podUID="9f085049-2b98-4b5a-8813-b25cfcfd6bf6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.52:8643/healthz\": dial tcp 10.133.0.52:8643: connect: connection refused" Apr 24 22:16:04.465744 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:04.465718 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" Apr 24 22:16:07.346611 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:07.346567 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" podUID="9f085049-2b98-4b5a-8813-b25cfcfd6bf6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.52:8643/healthz\": dial tcp 10.133.0.52:8643: connect: connection refused" Apr 24 22:16:11.524489 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:11.524462 2570 generic.go:358] "Generic (PLEG): container finished" podID="9f085049-2b98-4b5a-8813-b25cfcfd6bf6" containerID="c39e950423a4406b60f6a99d04b9d666e348dc67ebdaca97efeea513ee994625" exitCode=137 Apr 24 22:16:11.524813 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:11.524501 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" event={"ID":"9f085049-2b98-4b5a-8813-b25cfcfd6bf6","Type":"ContainerDied","Data":"c39e950423a4406b60f6a99d04b9d666e348dc67ebdaca97efeea513ee994625"} Apr 24 22:16:12.026724 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:12.026700 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" Apr 24 22:16:12.151211 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:12.151122 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"9f085049-2b98-4b5a-8813-b25cfcfd6bf6\" (UID: \"9f085049-2b98-4b5a-8813-b25cfcfd6bf6\") " Apr 24 22:16:12.151211 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:12.151168 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxg8h\" (UniqueName: \"kubernetes.io/projected/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-kube-api-access-gxg8h\") pod \"9f085049-2b98-4b5a-8813-b25cfcfd6bf6\" (UID: \"9f085049-2b98-4b5a-8813-b25cfcfd6bf6\") " Apr 24 22:16:12.151426 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:12.151219 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-kserve-provision-location\") pod \"9f085049-2b98-4b5a-8813-b25cfcfd6bf6\" (UID: \"9f085049-2b98-4b5a-8813-b25cfcfd6bf6\") " Apr 24 22:16:12.151426 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:12.151260 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-proxy-tls\") pod \"9f085049-2b98-4b5a-8813-b25cfcfd6bf6\" (UID: \"9f085049-2b98-4b5a-8813-b25cfcfd6bf6\") " Apr 24 22:16:12.151524 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:12.151502 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-isvc-tensorflow-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-kube-rbac-proxy-sar-config") pod "9f085049-2b98-4b5a-8813-b25cfcfd6bf6" (UID: "9f085049-2b98-4b5a-8813-b25cfcfd6bf6"). InnerVolumeSpecName "isvc-tensorflow-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:16:12.153485 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:12.153461 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9f085049-2b98-4b5a-8813-b25cfcfd6bf6" (UID: "9f085049-2b98-4b5a-8813-b25cfcfd6bf6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:16:12.153619 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:12.153604 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-kube-api-access-gxg8h" (OuterVolumeSpecName: "kube-api-access-gxg8h") pod "9f085049-2b98-4b5a-8813-b25cfcfd6bf6" (UID: "9f085049-2b98-4b5a-8813-b25cfcfd6bf6"). InnerVolumeSpecName "kube-api-access-gxg8h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:16:12.162542 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:12.162518 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9f085049-2b98-4b5a-8813-b25cfcfd6bf6" (UID: "9f085049-2b98-4b5a-8813-b25cfcfd6bf6"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:16:12.252377 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:12.252343 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:16:12.252377 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:12.252372 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-isvc-tensorflow-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:16:12.252377 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:12.252383 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gxg8h\" (UniqueName: \"kubernetes.io/projected/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-kube-api-access-gxg8h\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:16:12.252583 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:12.252393 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f085049-2b98-4b5a-8813-b25cfcfd6bf6-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:16:12.529289 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:12.529247 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" event={"ID":"9f085049-2b98-4b5a-8813-b25cfcfd6bf6","Type":"ContainerDied","Data":"da7469f348228766cb7e0f1683cd3db35fcc8a4dd7bb415f4c382208a2b0b115"} Apr 24 22:16:12.529289 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:12.529288 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j" Apr 24 22:16:12.529769 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:12.529295 2570 scope.go:117] "RemoveContainer" containerID="85dcc7e3f11606c1a8e779757737b4850acb6d368a2f6881c99b8eb87818aa11" Apr 24 22:16:12.538104 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:12.538086 2570 scope.go:117] "RemoveContainer" containerID="c39e950423a4406b60f6a99d04b9d666e348dc67ebdaca97efeea513ee994625" Apr 24 22:16:12.544850 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:12.544833 2570 scope.go:117] "RemoveContainer" containerID="c40a97e4630fbb3785aa477ef4d760a4cb91f180e4b9bdff7df0a4d6e582af09" Apr 24 22:16:12.552441 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:12.552422 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j"] Apr 24 22:16:12.556255 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:12.556235 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-x588j"] Apr 24 22:16:14.068517 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:14.068478 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f085049-2b98-4b5a-8813-b25cfcfd6bf6" path="/var/lib/kubelet/pods/9f085049-2b98-4b5a-8813-b25cfcfd6bf6/volumes" Apr 24 22:16:22.158246 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.158214 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m"] Apr 24 22:16:22.158806 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.158501 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" podUID="4e7bb452-7407-4043-bf44-0bd6c1e61bed" containerName="kserve-container" containerID="cri-o://64000a1749522b8fc08009149a5633f2fb69165b2dbaba9077bf53a9d4e0aabf" gracePeriod=30 Apr 24 22:16:22.158806 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.158555 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" podUID="4e7bb452-7407-4043-bf44-0bd6c1e61bed" containerName="kube-rbac-proxy" containerID="cri-o://82acaa1fbce85165f5a82ae2cef6e50be69929cb781522d3447177af4347b062" gracePeriod=30 Apr 24 22:16:22.259490 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.259461 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d"] Apr 24 22:16:22.259719 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.259708 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f085049-2b98-4b5a-8813-b25cfcfd6bf6" containerName="kserve-container" Apr 24 22:16:22.259770 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.259721 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f085049-2b98-4b5a-8813-b25cfcfd6bf6" containerName="kserve-container" Apr 24 22:16:22.259770 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.259735 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f085049-2b98-4b5a-8813-b25cfcfd6bf6" containerName="kube-rbac-proxy" Apr 24 22:16:22.259770 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.259740 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f085049-2b98-4b5a-8813-b25cfcfd6bf6" containerName="kube-rbac-proxy" Apr 24 22:16:22.259770 ip-10-0-129-230 
kubenswrapper[2570]: I0424 22:16:22.259749 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f085049-2b98-4b5a-8813-b25cfcfd6bf6" containerName="storage-initializer" Apr 24 22:16:22.259770 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.259754 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f085049-2b98-4b5a-8813-b25cfcfd6bf6" containerName="storage-initializer" Apr 24 22:16:22.259927 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.259807 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f085049-2b98-4b5a-8813-b25cfcfd6bf6" containerName="kube-rbac-proxy" Apr 24 22:16:22.259927 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.259815 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f085049-2b98-4b5a-8813-b25cfcfd6bf6" containerName="kserve-container" Apr 24 22:16:22.264370 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.264349 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" Apr 24 22:16:22.266542 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.266520 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-predictor-serving-cert\"" Apr 24 22:16:22.266658 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.266580 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-kube-rbac-proxy-sar-config\"" Apr 24 22:16:22.272341 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.272319 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d"] Apr 24 22:16:22.325920 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.325891 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6qpb\" (UniqueName: \"kubernetes.io/projected/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-kube-api-access-f6qpb\") pod \"isvc-triton-predictor-84bb65d94b-fn44d\" (UID: \"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" Apr 24 22:16:22.326109 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.325940 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-fn44d\" (UID: \"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" Apr 24 22:16:22.326109 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.326050 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-fn44d\" (UID: \"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" Apr 24 22:16:22.326109 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.326086 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-fn44d\" (UID: 
\"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" Apr 24 22:16:22.426835 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.426810 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f6qpb\" (UniqueName: \"kubernetes.io/projected/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-kube-api-access-f6qpb\") pod \"isvc-triton-predictor-84bb65d94b-fn44d\" (UID: \"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" Apr 24 22:16:22.426960 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.426846 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-fn44d\" (UID: \"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" Apr 24 22:16:22.427091 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.427066 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-fn44d\" (UID: \"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" Apr 24 22:16:22.427165 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.427127 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-fn44d\" (UID: \"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" Apr 24 22:16:22.427225 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.427183 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-fn44d\" (UID: \"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" Apr 24 22:16:22.427679 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.427659 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-fn44d\" (UID: \"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" Apr 24 22:16:22.429636 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.429615 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-fn44d\" (UID: \"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" Apr 24 22:16:22.435545 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.435519 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6qpb\" (UniqueName: \"kubernetes.io/projected/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-kube-api-access-f6qpb\") 
pod \"isvc-triton-predictor-84bb65d94b-fn44d\" (UID: \"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" Apr 24 22:16:22.558655 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.558620 2570 generic.go:358] "Generic (PLEG): container finished" podID="4e7bb452-7407-4043-bf44-0bd6c1e61bed" containerID="82acaa1fbce85165f5a82ae2cef6e50be69929cb781522d3447177af4347b062" exitCode=2 Apr 24 22:16:22.558810 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.558674 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" event={"ID":"4e7bb452-7407-4043-bf44-0bd6c1e61bed","Type":"ContainerDied","Data":"82acaa1fbce85165f5a82ae2cef6e50be69929cb781522d3447177af4347b062"} Apr 24 22:16:22.575840 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.575813 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" Apr 24 22:16:22.696962 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:22.696887 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d"] Apr 24 22:16:22.699810 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:16:22.699785 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcd6eac8_0564_4ff0_9e5f_ef797fabc0be.slice/crio-ec951d58216489fd210eadeef1f3a4aa151bb5f95cc1cf508df74a4f6b70df1d WatchSource:0}: Error finding container ec951d58216489fd210eadeef1f3a4aa151bb5f95cc1cf508df74a4f6b70df1d: Status 404 returned error can't find the container with id ec951d58216489fd210eadeef1f3a4aa151bb5f95cc1cf508df74a4f6b70df1d Apr 24 22:16:23.562983 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:23.562946 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" event={"ID":"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be","Type":"ContainerStarted","Data":"920fae082b1901fc501bdc79a6f09b8b4d2bd6fb578d0f2944f47f883c7bd268"} Apr 24 22:16:23.562983 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:23.562981 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" event={"ID":"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be","Type":"ContainerStarted","Data":"ec951d58216489fd210eadeef1f3a4aa151bb5f95cc1cf508df74a4f6b70df1d"} Apr 24 22:16:24.462137 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:24.462099 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" podUID="4e7bb452-7407-4043-bf44-0bd6c1e61bed" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.53:8643/healthz\": dial tcp 10.133.0.53:8643: connect: connection refused" Apr 24 22:16:27.575045 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:27.574993 2570 generic.go:358] "Generic (PLEG): container finished" podID="fcd6eac8-0564-4ff0-9e5f-ef797fabc0be" containerID="920fae082b1901fc501bdc79a6f09b8b4d2bd6fb578d0f2944f47f883c7bd268" exitCode=0 Apr 24 22:16:27.575509 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:27.575050 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" event={"ID":"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be","Type":"ContainerDied","Data":"920fae082b1901fc501bdc79a6f09b8b4d2bd6fb578d0f2944f47f883c7bd268"} Apr 24 22:16:29.461853 
ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:29.461790 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" podUID="4e7bb452-7407-4043-bf44-0bd6c1e61bed" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.53:8643/healthz\": dial tcp 10.133.0.53:8643: connect: connection refused" Apr 24 22:16:34.461939 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:34.461871 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" podUID="4e7bb452-7407-4043-bf44-0bd6c1e61bed" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.53:8643/healthz\": dial tcp 10.133.0.53:8643: connect: connection refused" Apr 24 22:16:34.462567 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:34.462043 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" Apr 24 22:16:39.462404 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:39.462114 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" podUID="4e7bb452-7407-4043-bf44-0bd6c1e61bed" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.53:8643/healthz\": dial tcp 10.133.0.53:8643: connect: connection refused" Apr 24 22:16:44.462349 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:44.462170 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" podUID="4e7bb452-7407-4043-bf44-0bd6c1e61bed" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.53:8643/healthz\": dial tcp 10.133.0.53:8643: connect: connection refused" Apr 24 22:16:49.462091 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:49.462047 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" podUID="4e7bb452-7407-4043-bf44-0bd6c1e61bed" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.53:8643/healthz\": dial tcp 10.133.0.53:8643: connect: connection refused" Apr 24 22:16:52.678123 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:52.678073 2570 generic.go:358] "Generic (PLEG): container finished" podID="4e7bb452-7407-4043-bf44-0bd6c1e61bed" containerID="64000a1749522b8fc08009149a5633f2fb69165b2dbaba9077bf53a9d4e0aabf" exitCode=137 Apr 24 22:16:52.678676 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:52.678149 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" event={"ID":"4e7bb452-7407-4043-bf44-0bd6c1e61bed","Type":"ContainerDied","Data":"64000a1749522b8fc08009149a5633f2fb69165b2dbaba9077bf53a9d4e0aabf"} Apr 24 22:16:52.844205 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:52.844162 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" Apr 24 22:16:52.990015 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:52.989923 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4e7bb452-7407-4043-bf44-0bd6c1e61bed-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"4e7bb452-7407-4043-bf44-0bd6c1e61bed\" (UID: \"4e7bb452-7407-4043-bf44-0bd6c1e61bed\") " Apr 24 22:16:52.990015 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:52.989981 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4ptz\" (UniqueName: \"kubernetes.io/projected/4e7bb452-7407-4043-bf44-0bd6c1e61bed-kube-api-access-q4ptz\") pod \"4e7bb452-7407-4043-bf44-0bd6c1e61bed\" (UID: \"4e7bb452-7407-4043-bf44-0bd6c1e61bed\") " Apr 24 22:16:52.990280 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:52.990103 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e7bb452-7407-4043-bf44-0bd6c1e61bed-kserve-provision-location\") pod \"4e7bb452-7407-4043-bf44-0bd6c1e61bed\" (UID: \"4e7bb452-7407-4043-bf44-0bd6c1e61bed\") " Apr 24 22:16:52.990280 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:52.990158 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4e7bb452-7407-4043-bf44-0bd6c1e61bed-proxy-tls\") pod \"4e7bb452-7407-4043-bf44-0bd6c1e61bed\" (UID: \"4e7bb452-7407-4043-bf44-0bd6c1e61bed\") " Apr 24 22:16:52.990471 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:52.990419 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e7bb452-7407-4043-bf44-0bd6c1e61bed-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config") pod "4e7bb452-7407-4043-bf44-0bd6c1e61bed" (UID: "4e7bb452-7407-4043-bf44-0bd6c1e61bed"). InnerVolumeSpecName "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:16:52.993257 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:52.993204 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e7bb452-7407-4043-bf44-0bd6c1e61bed-kube-api-access-q4ptz" (OuterVolumeSpecName: "kube-api-access-q4ptz") pod "4e7bb452-7407-4043-bf44-0bd6c1e61bed" (UID: "4e7bb452-7407-4043-bf44-0bd6c1e61bed"). InnerVolumeSpecName "kube-api-access-q4ptz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:16:52.993385 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:52.993313 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e7bb452-7407-4043-bf44-0bd6c1e61bed-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4e7bb452-7407-4043-bf44-0bd6c1e61bed" (UID: "4e7bb452-7407-4043-bf44-0bd6c1e61bed"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:16:52.996167 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:52.996144 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e7bb452-7407-4043-bf44-0bd6c1e61bed-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4e7bb452-7407-4043-bf44-0bd6c1e61bed" (UID: "4e7bb452-7407-4043-bf44-0bd6c1e61bed"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:16:53.091054 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:53.090972 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e7bb452-7407-4043-bf44-0bd6c1e61bed-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:16:53.091054 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:53.091009 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4e7bb452-7407-4043-bf44-0bd6c1e61bed-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:16:53.091054 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:53.091046 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4e7bb452-7407-4043-bf44-0bd6c1e61bed-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:16:53.091342 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:53.091062 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q4ptz\" (UniqueName: \"kubernetes.io/projected/4e7bb452-7407-4043-bf44-0bd6c1e61bed-kube-api-access-q4ptz\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:16:53.684530 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:53.684493 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" event={"ID":"4e7bb452-7407-4043-bf44-0bd6c1e61bed","Type":"ContainerDied","Data":"876ef00bdbf1b38208690fc63e940b3b8ddc4357bab232f045f53798458a783c"} Apr 24 22:16:53.684530 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:53.684527 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m" Apr 24 22:16:53.685044 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:53.684546 2570 scope.go:117] "RemoveContainer" containerID="82acaa1fbce85165f5a82ae2cef6e50be69929cb781522d3447177af4347b062" Apr 24 22:16:53.697139 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:53.697113 2570 scope.go:117] "RemoveContainer" containerID="64000a1749522b8fc08009149a5633f2fb69165b2dbaba9077bf53a9d4e0aabf" Apr 24 22:16:53.707696 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:53.707673 2570 scope.go:117] "RemoveContainer" containerID="017272c3334f16ccc022bd3c4eeb85400930e492126a28a87684879735fc99c9" Apr 24 22:16:53.711060 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:53.711009 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m"] Apr 24 22:16:53.718462 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:53.718437 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-kzl4m"] Apr 24 22:16:54.068214 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:16:54.068131 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e7bb452-7407-4043-bf44-0bd6c1e61bed" path="/var/lib/kubelet/pods/4e7bb452-7407-4043-bf44-0bd6c1e61bed/volumes" Apr 24 22:18:21.126917 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:21.126840 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 22:18:21.126917 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:21.126842 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 22:18:22.958256 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:22.958216 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" event={"ID":"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be","Type":"ContainerStarted","Data":"d3bcfa67f5254a321d62117790ef2cbb2e657ddbe2efa47d10160d3ab4288095"} Apr 24 22:18:22.958256 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:22.958260 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" event={"ID":"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be","Type":"ContainerStarted","Data":"70c0649055070ba46cc62d14c82cd2b1e183ae709ac1db7fc423cb2928b2d893"} Apr 24 22:18:22.958706 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:22.958370 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" Apr 24 22:18:22.991282 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:22.991226 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" podStartSLOduration=6.147040119 podStartE2EDuration="2m0.99121039s" podCreationTimestamp="2026-04-24 22:16:22 +0000 UTC" firstStartedPulling="2026-04-24 22:16:27.576118983 +0000 UTC m=+2939.999626369" lastFinishedPulling="2026-04-24 22:18:22.420289251 +0000 UTC m=+3054.843796640" observedRunningTime="2026-04-24 22:18:22.989930596 +0000 UTC m=+3055.413438003" watchObservedRunningTime="2026-04-24 22:18:22.99121039 +0000 UTC m=+3055.414717797" Apr 24 22:18:23.960944 ip-10-0-129-230 kubenswrapper[2570]: I0424 
22:18:23.960914 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" Apr 24 22:18:23.962142 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:23.962117 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" podUID="fcd6eac8-0564-4ff0-9e5f-ef797fabc0be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused" Apr 24 22:18:24.964082 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:24.964011 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" podUID="fcd6eac8-0564-4ff0-9e5f-ef797fabc0be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused" Apr 24 22:18:29.968701 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:29.968671 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" Apr 24 22:18:29.969543 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:29.969525 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" Apr 24 22:18:34.680207 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:34.679944 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d"] Apr 24 22:18:34.702326 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:34.680316 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" podUID="fcd6eac8-0564-4ff0-9e5f-ef797fabc0be" containerName="kserve-container" containerID="cri-o://70c0649055070ba46cc62d14c82cd2b1e183ae709ac1db7fc423cb2928b2d893" gracePeriod=30 Apr 24 22:18:34.702326 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:34.680579 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" podUID="fcd6eac8-0564-4ff0-9e5f-ef797fabc0be" containerName="kube-rbac-proxy" containerID="cri-o://d3bcfa67f5254a321d62117790ef2cbb2e657ddbe2efa47d10160d3ab4288095" gracePeriod=30 Apr 24 22:18:34.775781 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:34.775745 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg"] Apr 24 22:18:34.776106 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:34.776090 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e7bb452-7407-4043-bf44-0bd6c1e61bed" containerName="kube-rbac-proxy" Apr 24 22:18:34.776191 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:34.776108 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e7bb452-7407-4043-bf44-0bd6c1e61bed" containerName="kube-rbac-proxy" Apr 24 22:18:34.776191 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:34.776124 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e7bb452-7407-4043-bf44-0bd6c1e61bed" containerName="kserve-container" Apr 24 22:18:34.776191 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:34.776131 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e7bb452-7407-4043-bf44-0bd6c1e61bed" containerName="kserve-container" Apr 24 22:18:34.776191 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:34.776147 2570 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="4e7bb452-7407-4043-bf44-0bd6c1e61bed" containerName="storage-initializer" Apr 24 22:18:34.776191 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:34.776155 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e7bb452-7407-4043-bf44-0bd6c1e61bed" containerName="storage-initializer" Apr 24 22:18:34.776448 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:34.776236 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e7bb452-7407-4043-bf44-0bd6c1e61bed" containerName="kube-rbac-proxy" Apr 24 22:18:34.776448 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:34.776251 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e7bb452-7407-4043-bf44-0bd6c1e61bed" containerName="kserve-container" Apr 24 22:18:34.796512 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:34.796478 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg"] Apr 24 22:18:34.796654 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:34.796538 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" Apr 24 22:18:34.798590 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:34.798571 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-kube-rbac-proxy-sar-config\"" Apr 24 22:18:34.798691 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:34.798588 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-predictor-serving-cert\"" Apr 24 22:18:34.911943 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:34.911910 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e81be839-a938-4663-8282-3aa3ba5bf3df-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-fdbwg\" (UID: \"e81be839-a938-4663-8282-3aa3ba5bf3df\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" Apr 24 22:18:34.912080 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:34.911959 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e81be839-a938-4663-8282-3aa3ba5bf3df-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-fdbwg\" (UID: \"e81be839-a938-4663-8282-3aa3ba5bf3df\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" Apr 24 22:18:34.912080 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:34.912044 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e81be839-a938-4663-8282-3aa3ba5bf3df-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-fdbwg\" (UID: \"e81be839-a938-4663-8282-3aa3ba5bf3df\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" Apr 24 22:18:34.912168 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:34.912087 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcxjh\" (UniqueName: \"kubernetes.io/projected/e81be839-a938-4663-8282-3aa3ba5bf3df-kube-api-access-xcxjh\") pod \"isvc-xgboost-predictor-8689c4cfcc-fdbwg\" (UID: \"e81be839-a938-4663-8282-3aa3ba5bf3df\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" Apr 24 22:18:34.964355 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:34.964286 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" podUID="fcd6eac8-0564-4ff0-9e5f-ef797fabc0be" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.54:8643/healthz\": dial tcp 10.133.0.54:8643: connect: connection refused" Apr 24 22:18:34.990995 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:34.990973 2570 generic.go:358] "Generic (PLEG): container finished" podID="fcd6eac8-0564-4ff0-9e5f-ef797fabc0be" containerID="d3bcfa67f5254a321d62117790ef2cbb2e657ddbe2efa47d10160d3ab4288095" exitCode=2 Apr 24 22:18:34.991099 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:34.991052 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" event={"ID":"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be","Type":"ContainerDied","Data":"d3bcfa67f5254a321d62117790ef2cbb2e657ddbe2efa47d10160d3ab4288095"} Apr 24 22:18:35.013319 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:35.013296 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e81be839-a938-4663-8282-3aa3ba5bf3df-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-fdbwg\" (UID: \"e81be839-a938-4663-8282-3aa3ba5bf3df\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" Apr 24 22:18:35.013397 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:35.013338 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e81be839-a938-4663-8282-3aa3ba5bf3df-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-fdbwg\" (UID: \"e81be839-a938-4663-8282-3aa3ba5bf3df\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" Apr 24 22:18:35.013437 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:35.013388 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e81be839-a938-4663-8282-3aa3ba5bf3df-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-fdbwg\" (UID: \"e81be839-a938-4663-8282-3aa3ba5bf3df\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" Apr 24 22:18:35.013503 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:35.013433 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcxjh\" (UniqueName: \"kubernetes.io/projected/e81be839-a938-4663-8282-3aa3ba5bf3df-kube-api-access-xcxjh\") pod \"isvc-xgboost-predictor-8689c4cfcc-fdbwg\" (UID: \"e81be839-a938-4663-8282-3aa3ba5bf3df\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" Apr 24 22:18:35.013561 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:18:35.013524 2570 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-predictor-serving-cert: secret "isvc-xgboost-predictor-serving-cert" not found Apr 24 22:18:35.013617 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:18:35.013602 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e81be839-a938-4663-8282-3aa3ba5bf3df-proxy-tls podName:e81be839-a938-4663-8282-3aa3ba5bf3df nodeName:}" failed. No retries permitted until 2026-04-24 22:18:35.51358264 +0000 UTC m=+3067.937090025 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e81be839-a938-4663-8282-3aa3ba5bf3df-proxy-tls") pod "isvc-xgboost-predictor-8689c4cfcc-fdbwg" (UID: "e81be839-a938-4663-8282-3aa3ba5bf3df") : secret "isvc-xgboost-predictor-serving-cert" not found Apr 24 22:18:35.013734 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:35.013713 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e81be839-a938-4663-8282-3aa3ba5bf3df-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-fdbwg\" (UID: \"e81be839-a938-4663-8282-3aa3ba5bf3df\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" Apr 24 22:18:35.014080 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:35.014060 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e81be839-a938-4663-8282-3aa3ba5bf3df-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-fdbwg\" (UID: \"e81be839-a938-4663-8282-3aa3ba5bf3df\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" Apr 24 22:18:35.021889 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:35.021869 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcxjh\" (UniqueName: \"kubernetes.io/projected/e81be839-a938-4663-8282-3aa3ba5bf3df-kube-api-access-xcxjh\") pod \"isvc-xgboost-predictor-8689c4cfcc-fdbwg\" (UID: \"e81be839-a938-4663-8282-3aa3ba5bf3df\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" Apr 24 22:18:35.516974 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:35.516942 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e81be839-a938-4663-8282-3aa3ba5bf3df-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-fdbwg\" (UID: \"e81be839-a938-4663-8282-3aa3ba5bf3df\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" Apr 24 22:18:35.519451 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:35.519423 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e81be839-a938-4663-8282-3aa3ba5bf3df-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-fdbwg\" (UID: \"e81be839-a938-4663-8282-3aa3ba5bf3df\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" Apr 24 22:18:35.706203 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:35.706175 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" Apr 24 22:18:35.839109 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:35.839075 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg"] Apr 24 22:18:35.842754 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:18:35.842728 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode81be839_a938_4663_8282_3aa3ba5bf3df.slice/crio-15fb6f6116b409225d7f6d231ae143952d5acd008d1732e02a78958ae749260a WatchSource:0}: Error finding container 15fb6f6116b409225d7f6d231ae143952d5acd008d1732e02a78958ae749260a: Status 404 returned error can't find the container with id 15fb6f6116b409225d7f6d231ae143952d5acd008d1732e02a78958ae749260a Apr 24 22:18:35.995145 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:35.995113 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" event={"ID":"e81be839-a938-4663-8282-3aa3ba5bf3df","Type":"ContainerStarted","Data":"3b418500b08ce32fde8a0ebeeb483e3fe5479dcba15474f7e0171121f00e3bae"} Apr 24 22:18:35.995252 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:35.995151 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" event={"ID":"e81be839-a938-4663-8282-3aa3ba5bf3df","Type":"ContainerStarted","Data":"15fb6f6116b409225d7f6d231ae143952d5acd008d1732e02a78958ae749260a"} Apr 24 22:18:37.230196 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:37.230173 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" Apr 24 22:18:37.330194 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:37.330120 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-kserve-provision-location\") pod \"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be\" (UID: \"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be\") " Apr 24 22:18:37.330194 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:37.330169 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6qpb\" (UniqueName: \"kubernetes.io/projected/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-kube-api-access-f6qpb\") pod \"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be\" (UID: \"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be\") " Apr 24 22:18:37.330380 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:37.330204 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-proxy-tls\") pod \"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be\" (UID: \"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be\") " Apr 24 22:18:37.330380 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:37.330260 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-isvc-triton-kube-rbac-proxy-sar-config\") pod \"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be\" (UID: \"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be\") " Apr 24 22:18:37.330596 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:37.330571 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fcd6eac8-0564-4ff0-9e5f-ef797fabc0be" (UID: "fcd6eac8-0564-4ff0-9e5f-ef797fabc0be"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:18:37.330675 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:37.330653 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-isvc-triton-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-triton-kube-rbac-proxy-sar-config") pod "fcd6eac8-0564-4ff0-9e5f-ef797fabc0be" (UID: "fcd6eac8-0564-4ff0-9e5f-ef797fabc0be"). InnerVolumeSpecName "isvc-triton-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:18:37.332506 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:37.332488 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-kube-api-access-f6qpb" (OuterVolumeSpecName: "kube-api-access-f6qpb") pod "fcd6eac8-0564-4ff0-9e5f-ef797fabc0be" (UID: "fcd6eac8-0564-4ff0-9e5f-ef797fabc0be"). InnerVolumeSpecName "kube-api-access-f6qpb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:18:37.332574 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:37.332506 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fcd6eac8-0564-4ff0-9e5f-ef797fabc0be" (UID: "fcd6eac8-0564-4ff0-9e5f-ef797fabc0be"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:18:37.431678 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:37.431646 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:18:37.431678 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:37.431673 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f6qpb\" (UniqueName: \"kubernetes.io/projected/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-kube-api-access-f6qpb\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:18:37.431678 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:37.431683 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:18:37.431900 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:37.431694 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be-isvc-triton-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:18:38.001947 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:38.001916 2570 generic.go:358] "Generic (PLEG): container finished" podID="fcd6eac8-0564-4ff0-9e5f-ef797fabc0be" containerID="70c0649055070ba46cc62d14c82cd2b1e183ae709ac1db7fc423cb2928b2d893" exitCode=0 Apr 24 22:18:38.002140 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:38.001978 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" event={"ID":"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be","Type":"ContainerDied","Data":"70c0649055070ba46cc62d14c82cd2b1e183ae709ac1db7fc423cb2928b2d893"} Apr 24 22:18:38.002140 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:38.001997 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" Apr 24 22:18:38.002140 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:38.002015 2570 scope.go:117] "RemoveContainer" containerID="d3bcfa67f5254a321d62117790ef2cbb2e657ddbe2efa47d10160d3ab4288095" Apr 24 22:18:38.002286 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:38.002005 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d" event={"ID":"fcd6eac8-0564-4ff0-9e5f-ef797fabc0be","Type":"ContainerDied","Data":"ec951d58216489fd210eadeef1f3a4aa151bb5f95cc1cf508df74a4f6b70df1d"} Apr 24 22:18:38.011160 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:38.011142 2570 scope.go:117] "RemoveContainer" containerID="70c0649055070ba46cc62d14c82cd2b1e183ae709ac1db7fc423cb2928b2d893" Apr 24 22:18:38.017947 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:38.017931 2570 scope.go:117] "RemoveContainer" containerID="920fae082b1901fc501bdc79a6f09b8b4d2bd6fb578d0f2944f47f883c7bd268" Apr 24 22:18:38.024960 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:38.024942 2570 scope.go:117] "RemoveContainer" containerID="d3bcfa67f5254a321d62117790ef2cbb2e657ddbe2efa47d10160d3ab4288095" Apr 24 22:18:38.025177 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:38.025160 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d"] Apr 24 22:18:38.025292 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:18:38.025271 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3bcfa67f5254a321d62117790ef2cbb2e657ddbe2efa47d10160d3ab4288095\": container with ID starting with d3bcfa67f5254a321d62117790ef2cbb2e657ddbe2efa47d10160d3ab4288095 not found: ID does not exist" containerID="d3bcfa67f5254a321d62117790ef2cbb2e657ddbe2efa47d10160d3ab4288095" Apr 24 22:18:38.025363 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:38.025306 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3bcfa67f5254a321d62117790ef2cbb2e657ddbe2efa47d10160d3ab4288095"} err="failed to get container status \"d3bcfa67f5254a321d62117790ef2cbb2e657ddbe2efa47d10160d3ab4288095\": rpc error: code = NotFound desc = could not find container \"d3bcfa67f5254a321d62117790ef2cbb2e657ddbe2efa47d10160d3ab4288095\": container with ID starting with d3bcfa67f5254a321d62117790ef2cbb2e657ddbe2efa47d10160d3ab4288095 not found: ID does not exist" Apr 24 22:18:38.025363 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:38.025331 2570 scope.go:117] "RemoveContainer" containerID="70c0649055070ba46cc62d14c82cd2b1e183ae709ac1db7fc423cb2928b2d893" Apr 24 22:18:38.025611 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:18:38.025582 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70c0649055070ba46cc62d14c82cd2b1e183ae709ac1db7fc423cb2928b2d893\": container with ID starting with 70c0649055070ba46cc62d14c82cd2b1e183ae709ac1db7fc423cb2928b2d893 not found: ID does not exist" 
containerID="70c0649055070ba46cc62d14c82cd2b1e183ae709ac1db7fc423cb2928b2d893" Apr 24 22:18:38.025729 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:38.025611 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70c0649055070ba46cc62d14c82cd2b1e183ae709ac1db7fc423cb2928b2d893"} err="failed to get container status \"70c0649055070ba46cc62d14c82cd2b1e183ae709ac1db7fc423cb2928b2d893\": rpc error: code = NotFound desc = could not find container \"70c0649055070ba46cc62d14c82cd2b1e183ae709ac1db7fc423cb2928b2d893\": container with ID starting with 70c0649055070ba46cc62d14c82cd2b1e183ae709ac1db7fc423cb2928b2d893 not found: ID does not exist" Apr 24 22:18:38.025729 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:38.025630 2570 scope.go:117] "RemoveContainer" containerID="920fae082b1901fc501bdc79a6f09b8b4d2bd6fb578d0f2944f47f883c7bd268" Apr 24 22:18:38.025882 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:18:38.025865 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"920fae082b1901fc501bdc79a6f09b8b4d2bd6fb578d0f2944f47f883c7bd268\": container with ID starting with 920fae082b1901fc501bdc79a6f09b8b4d2bd6fb578d0f2944f47f883c7bd268 not found: ID does not exist" containerID="920fae082b1901fc501bdc79a6f09b8b4d2bd6fb578d0f2944f47f883c7bd268" Apr 24 22:18:38.025924 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:38.025889 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"920fae082b1901fc501bdc79a6f09b8b4d2bd6fb578d0f2944f47f883c7bd268"} err="failed to get container status \"920fae082b1901fc501bdc79a6f09b8b4d2bd6fb578d0f2944f47f883c7bd268\": rpc error: code = NotFound desc = could not find container \"920fae082b1901fc501bdc79a6f09b8b4d2bd6fb578d0f2944f47f883c7bd268\": container with ID starting with 920fae082b1901fc501bdc79a6f09b8b4d2bd6fb578d0f2944f47f883c7bd268 not found: ID does not exist" Apr 24 22:18:38.029597 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:38.029578 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-fn44d"] Apr 24 22:18:38.066871 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:38.066836 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcd6eac8-0564-4ff0-9e5f-ef797fabc0be" path="/var/lib/kubelet/pods/fcd6eac8-0564-4ff0-9e5f-ef797fabc0be/volumes" Apr 24 22:18:40.010104 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:40.010017 2570 generic.go:358] "Generic (PLEG): container finished" podID="e81be839-a938-4663-8282-3aa3ba5bf3df" containerID="3b418500b08ce32fde8a0ebeeb483e3fe5479dcba15474f7e0171121f00e3bae" exitCode=0 Apr 24 22:18:40.010437 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:40.010094 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" event={"ID":"e81be839-a938-4663-8282-3aa3ba5bf3df","Type":"ContainerDied","Data":"3b418500b08ce32fde8a0ebeeb483e3fe5479dcba15474f7e0171121f00e3bae"} Apr 24 22:18:58.069593 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:58.069558 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" event={"ID":"e81be839-a938-4663-8282-3aa3ba5bf3df","Type":"ContainerStarted","Data":"a0acd83760609ff4f24add58c06a1cdeadfe1d012b877bd5470907a7f1214d6b"} Apr 24 22:18:58.069593 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:58.069596 2570 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" event={"ID":"e81be839-a938-4663-8282-3aa3ba5bf3df","Type":"ContainerStarted","Data":"490955503a8cb330962bbe6e3379d43ae3ba724379709d3cfa3d857dd7a388ec"} Apr 24 22:18:58.070089 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:58.069874 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" Apr 24 22:18:58.070089 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:58.069889 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" Apr 24 22:18:58.071263 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:58.071239 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" podUID="e81be839-a938-4663-8282-3aa3ba5bf3df" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 24 22:18:58.091571 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:58.091515 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" podStartSLOduration=7.079027315 podStartE2EDuration="24.091500678s" podCreationTimestamp="2026-04-24 22:18:34 +0000 UTC" firstStartedPulling="2026-04-24 22:18:40.011276547 +0000 UTC m=+3072.434783933" lastFinishedPulling="2026-04-24 22:18:57.023749912 +0000 UTC m=+3089.447257296" observedRunningTime="2026-04-24 22:18:58.089836949 +0000 UTC m=+3090.513344355" watchObservedRunningTime="2026-04-24 22:18:58.091500678 +0000 UTC m=+3090.515008085" Apr 24 22:18:59.068926 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:18:59.068883 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" podUID="e81be839-a938-4663-8282-3aa3ba5bf3df" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 24 22:19:04.073415 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:19:04.073381 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" Apr 24 22:19:04.073979 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:19:04.073955 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" podUID="e81be839-a938-4663-8282-3aa3ba5bf3df" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 24 22:19:14.074953 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:19:14.074918 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" podUID="e81be839-a938-4663-8282-3aa3ba5bf3df" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 24 22:19:24.074493 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:19:24.074454 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" podUID="e81be839-a938-4663-8282-3aa3ba5bf3df" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 24 22:19:34.073929 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:19:34.073895 2570 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" podUID="e81be839-a938-4663-8282-3aa3ba5bf3df" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 24 22:19:44.074158 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:19:44.074073 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" podUID="e81be839-a938-4663-8282-3aa3ba5bf3df" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 24 22:19:54.074485 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:19:54.074447 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" podUID="e81be839-a938-4663-8282-3aa3ba5bf3df" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 24 22:20:04.074827 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:04.074790 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" Apr 24 22:20:04.900245 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:04.900213 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg"] Apr 24 22:20:04.900518 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:04.900494 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" podUID="e81be839-a938-4663-8282-3aa3ba5bf3df" containerName="kserve-container" containerID="cri-o://490955503a8cb330962bbe6e3379d43ae3ba724379709d3cfa3d857dd7a388ec" gracePeriod=30 Apr 24 22:20:04.900600 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:04.900573 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" podUID="e81be839-a938-4663-8282-3aa3ba5bf3df" containerName="kube-rbac-proxy" containerID="cri-o://a0acd83760609ff4f24add58c06a1cdeadfe1d012b877bd5470907a7f1214d6b" gracePeriod=30 Apr 24 22:20:05.011370 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.011326 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr"] Apr 24 22:20:05.011743 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.011724 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fcd6eac8-0564-4ff0-9e5f-ef797fabc0be" containerName="kserve-container" Apr 24 22:20:05.011790 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.011749 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcd6eac8-0564-4ff0-9e5f-ef797fabc0be" containerName="kserve-container" Apr 24 22:20:05.011790 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.011765 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fcd6eac8-0564-4ff0-9e5f-ef797fabc0be" containerName="kube-rbac-proxy" Apr 24 22:20:05.011790 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.011775 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcd6eac8-0564-4ff0-9e5f-ef797fabc0be" containerName="kube-rbac-proxy" Apr 24 22:20:05.011885 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.011791 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fcd6eac8-0564-4ff0-9e5f-ef797fabc0be" 
containerName="storage-initializer" Apr 24 22:20:05.011885 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.011801 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcd6eac8-0564-4ff0-9e5f-ef797fabc0be" containerName="storage-initializer" Apr 24 22:20:05.011947 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.011888 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="fcd6eac8-0564-4ff0-9e5f-ef797fabc0be" containerName="kserve-container" Apr 24 22:20:05.011947 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.011902 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="fcd6eac8-0564-4ff0-9e5f-ef797fabc0be" containerName="kube-rbac-proxy" Apr 24 22:20:05.015204 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.015186 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" Apr 24 22:20:05.017379 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.017356 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-predictor-serving-cert\"" Apr 24 22:20:05.017491 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.017389 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 24 22:20:05.024849 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.024825 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr"] Apr 24 22:20:05.094839 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.094795 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbgxw\" (UniqueName: \"kubernetes.io/projected/52da8257-9166-41fc-976f-ac8e938fc217-kube-api-access-jbgxw\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr\" (UID: \"52da8257-9166-41fc-976f-ac8e938fc217\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" Apr 24 22:20:05.095306 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.094850 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/52da8257-9166-41fc-976f-ac8e938fc217-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr\" (UID: \"52da8257-9166-41fc-976f-ac8e938fc217\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" Apr 24 22:20:05.095306 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.094877 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/52da8257-9166-41fc-976f-ac8e938fc217-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr\" (UID: \"52da8257-9166-41fc-976f-ac8e938fc217\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" Apr 24 22:20:05.095306 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.094909 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/52da8257-9166-41fc-976f-ac8e938fc217-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr\" (UID: 
\"52da8257-9166-41fc-976f-ac8e938fc217\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" Apr 24 22:20:05.196328 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.196292 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbgxw\" (UniqueName: \"kubernetes.io/projected/52da8257-9166-41fc-976f-ac8e938fc217-kube-api-access-jbgxw\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr\" (UID: \"52da8257-9166-41fc-976f-ac8e938fc217\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" Apr 24 22:20:05.196519 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.196351 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/52da8257-9166-41fc-976f-ac8e938fc217-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr\" (UID: \"52da8257-9166-41fc-976f-ac8e938fc217\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" Apr 24 22:20:05.196519 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.196375 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/52da8257-9166-41fc-976f-ac8e938fc217-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr\" (UID: \"52da8257-9166-41fc-976f-ac8e938fc217\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" Apr 24 22:20:05.196519 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.196407 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/52da8257-9166-41fc-976f-ac8e938fc217-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr\" (UID: \"52da8257-9166-41fc-976f-ac8e938fc217\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" Apr 24 22:20:05.196834 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.196813 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/52da8257-9166-41fc-976f-ac8e938fc217-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr\" (UID: \"52da8257-9166-41fc-976f-ac8e938fc217\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" Apr 24 22:20:05.197105 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.197088 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/52da8257-9166-41fc-976f-ac8e938fc217-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr\" (UID: \"52da8257-9166-41fc-976f-ac8e938fc217\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" Apr 24 22:20:05.199060 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.199042 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/52da8257-9166-41fc-976f-ac8e938fc217-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr\" (UID: \"52da8257-9166-41fc-976f-ac8e938fc217\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" Apr 24 22:20:05.204211 
ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.204187 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbgxw\" (UniqueName: \"kubernetes.io/projected/52da8257-9166-41fc-976f-ac8e938fc217-kube-api-access-jbgxw\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr\" (UID: \"52da8257-9166-41fc-976f-ac8e938fc217\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" Apr 24 22:20:05.249402 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.249372 2570 generic.go:358] "Generic (PLEG): container finished" podID="e81be839-a938-4663-8282-3aa3ba5bf3df" containerID="a0acd83760609ff4f24add58c06a1cdeadfe1d012b877bd5470907a7f1214d6b" exitCode=2 Apr 24 22:20:05.249555 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.249448 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" event={"ID":"e81be839-a938-4663-8282-3aa3ba5bf3df","Type":"ContainerDied","Data":"a0acd83760609ff4f24add58c06a1cdeadfe1d012b877bd5470907a7f1214d6b"} Apr 24 22:20:05.324897 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.324858 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" Apr 24 22:20:05.447067 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.446964 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr"] Apr 24 22:20:05.449953 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:20:05.449927 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52da8257_9166_41fc_976f_ac8e938fc217.slice/crio-2522a11f2d376f2d6f1663f89fb8150840f9cd2b8a5a918faf57f780f5a1fc9b WatchSource:0}: Error finding container 2522a11f2d376f2d6f1663f89fb8150840f9cd2b8a5a918faf57f780f5a1fc9b: Status 404 returned error can't find the container with id 2522a11f2d376f2d6f1663f89fb8150840f9cd2b8a5a918faf57f780f5a1fc9b Apr 24 22:20:05.451658 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:05.451643 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:20:06.253946 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:06.253911 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" event={"ID":"52da8257-9166-41fc-976f-ac8e938fc217","Type":"ContainerStarted","Data":"856b4a5cdb746a158fe55bf70662dbbc5764b9208234702f0daebca94fa8af4e"} Apr 24 22:20:06.253946 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:06.253949 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" event={"ID":"52da8257-9166-41fc-976f-ac8e938fc217","Type":"ContainerStarted","Data":"2522a11f2d376f2d6f1663f89fb8150840f9cd2b8a5a918faf57f780f5a1fc9b"} Apr 24 22:20:08.635891 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:08.635869 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" Apr 24 22:20:08.724380 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:08.724283 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e81be839-a938-4663-8282-3aa3ba5bf3df-proxy-tls\") pod \"e81be839-a938-4663-8282-3aa3ba5bf3df\" (UID: \"e81be839-a938-4663-8282-3aa3ba5bf3df\") " Apr 24 22:20:08.724380 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:08.724340 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e81be839-a938-4663-8282-3aa3ba5bf3df-kserve-provision-location\") pod \"e81be839-a938-4663-8282-3aa3ba5bf3df\" (UID: \"e81be839-a938-4663-8282-3aa3ba5bf3df\") " Apr 24 22:20:08.724604 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:08.724405 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e81be839-a938-4663-8282-3aa3ba5bf3df-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"e81be839-a938-4663-8282-3aa3ba5bf3df\" (UID: \"e81be839-a938-4663-8282-3aa3ba5bf3df\") " Apr 24 22:20:08.724604 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:08.724448 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcxjh\" (UniqueName: \"kubernetes.io/projected/e81be839-a938-4663-8282-3aa3ba5bf3df-kube-api-access-xcxjh\") pod \"e81be839-a938-4663-8282-3aa3ba5bf3df\" (UID: \"e81be839-a938-4663-8282-3aa3ba5bf3df\") " Apr 24 22:20:08.724714 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:08.724686 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e81be839-a938-4663-8282-3aa3ba5bf3df-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e81be839-a938-4663-8282-3aa3ba5bf3df" (UID: "e81be839-a938-4663-8282-3aa3ba5bf3df"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:20:08.724833 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:08.724807 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e81be839-a938-4663-8282-3aa3ba5bf3df-isvc-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-kube-rbac-proxy-sar-config") pod "e81be839-a938-4663-8282-3aa3ba5bf3df" (UID: "e81be839-a938-4663-8282-3aa3ba5bf3df"). InnerVolumeSpecName "isvc-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:20:08.726679 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:08.726655 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e81be839-a938-4663-8282-3aa3ba5bf3df-kube-api-access-xcxjh" (OuterVolumeSpecName: "kube-api-access-xcxjh") pod "e81be839-a938-4663-8282-3aa3ba5bf3df" (UID: "e81be839-a938-4663-8282-3aa3ba5bf3df"). InnerVolumeSpecName "kube-api-access-xcxjh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:20:08.726762 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:08.726686 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81be839-a938-4663-8282-3aa3ba5bf3df-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e81be839-a938-4663-8282-3aa3ba5bf3df" (UID: "e81be839-a938-4663-8282-3aa3ba5bf3df"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:20:08.825625 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:08.825570 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e81be839-a938-4663-8282-3aa3ba5bf3df-isvc-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:20:08.825625 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:08.825618 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xcxjh\" (UniqueName: \"kubernetes.io/projected/e81be839-a938-4663-8282-3aa3ba5bf3df-kube-api-access-xcxjh\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:20:08.825625 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:08.825630 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e81be839-a938-4663-8282-3aa3ba5bf3df-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:20:08.825625 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:08.825646 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e81be839-a938-4663-8282-3aa3ba5bf3df-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:20:09.265598 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:09.265560 2570 generic.go:358] "Generic (PLEG): container finished" podID="e81be839-a938-4663-8282-3aa3ba5bf3df" containerID="490955503a8cb330962bbe6e3379d43ae3ba724379709d3cfa3d857dd7a388ec" exitCode=0 Apr 24 22:20:09.265809 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:09.265638 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" event={"ID":"e81be839-a938-4663-8282-3aa3ba5bf3df","Type":"ContainerDied","Data":"490955503a8cb330962bbe6e3379d43ae3ba724379709d3cfa3d857dd7a388ec"} Apr 24 22:20:09.265809 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:09.265675 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" event={"ID":"e81be839-a938-4663-8282-3aa3ba5bf3df","Type":"ContainerDied","Data":"15fb6f6116b409225d7f6d231ae143952d5acd008d1732e02a78958ae749260a"} Apr 24 22:20:09.265809 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:09.265676 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg" Apr 24 22:20:09.265809 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:09.265690 2570 scope.go:117] "RemoveContainer" containerID="a0acd83760609ff4f24add58c06a1cdeadfe1d012b877bd5470907a7f1214d6b" Apr 24 22:20:09.274006 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:09.273988 2570 scope.go:117] "RemoveContainer" containerID="490955503a8cb330962bbe6e3379d43ae3ba724379709d3cfa3d857dd7a388ec" Apr 24 22:20:09.281195 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:09.281176 2570 scope.go:117] "RemoveContainer" containerID="3b418500b08ce32fde8a0ebeeb483e3fe5479dcba15474f7e0171121f00e3bae" Apr 24 22:20:09.288426 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:09.288409 2570 scope.go:117] "RemoveContainer" containerID="a0acd83760609ff4f24add58c06a1cdeadfe1d012b877bd5470907a7f1214d6b" Apr 24 22:20:09.288763 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:20:09.288727 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0acd83760609ff4f24add58c06a1cdeadfe1d012b877bd5470907a7f1214d6b\": container with ID starting with a0acd83760609ff4f24add58c06a1cdeadfe1d012b877bd5470907a7f1214d6b not found: ID does not exist" containerID="a0acd83760609ff4f24add58c06a1cdeadfe1d012b877bd5470907a7f1214d6b" Apr 24 22:20:09.288898 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:09.288763 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0acd83760609ff4f24add58c06a1cdeadfe1d012b877bd5470907a7f1214d6b"} err="failed to get container status \"a0acd83760609ff4f24add58c06a1cdeadfe1d012b877bd5470907a7f1214d6b\": rpc error: code = NotFound desc = could not find container \"a0acd83760609ff4f24add58c06a1cdeadfe1d012b877bd5470907a7f1214d6b\": container with ID starting with a0acd83760609ff4f24add58c06a1cdeadfe1d012b877bd5470907a7f1214d6b not found: ID does not exist" Apr 24 22:20:09.288898 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:09.288787 2570 scope.go:117] "RemoveContainer" containerID="490955503a8cb330962bbe6e3379d43ae3ba724379709d3cfa3d857dd7a388ec" Apr 24 22:20:09.289458 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:20:09.289266 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"490955503a8cb330962bbe6e3379d43ae3ba724379709d3cfa3d857dd7a388ec\": container with ID starting with 490955503a8cb330962bbe6e3379d43ae3ba724379709d3cfa3d857dd7a388ec not found: ID does not exist" containerID="490955503a8cb330962bbe6e3379d43ae3ba724379709d3cfa3d857dd7a388ec" Apr 24 22:20:09.289458 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:09.289303 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"490955503a8cb330962bbe6e3379d43ae3ba724379709d3cfa3d857dd7a388ec"} err="failed to get container status \"490955503a8cb330962bbe6e3379d43ae3ba724379709d3cfa3d857dd7a388ec\": rpc error: code = NotFound desc = could not find container \"490955503a8cb330962bbe6e3379d43ae3ba724379709d3cfa3d857dd7a388ec\": container with ID starting with 490955503a8cb330962bbe6e3379d43ae3ba724379709d3cfa3d857dd7a388ec not found: ID does not exist" Apr 24 22:20:09.289458 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:09.289326 2570 scope.go:117] "RemoveContainer" containerID="3b418500b08ce32fde8a0ebeeb483e3fe5479dcba15474f7e0171121f00e3bae" Apr 24 22:20:09.289626 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:20:09.289572 
2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b418500b08ce32fde8a0ebeeb483e3fe5479dcba15474f7e0171121f00e3bae\": container with ID starting with 3b418500b08ce32fde8a0ebeeb483e3fe5479dcba15474f7e0171121f00e3bae not found: ID does not exist" containerID="3b418500b08ce32fde8a0ebeeb483e3fe5479dcba15474f7e0171121f00e3bae" Apr 24 22:20:09.289626 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:09.289591 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b418500b08ce32fde8a0ebeeb483e3fe5479dcba15474f7e0171121f00e3bae"} err="failed to get container status \"3b418500b08ce32fde8a0ebeeb483e3fe5479dcba15474f7e0171121f00e3bae\": rpc error: code = NotFound desc = could not find container \"3b418500b08ce32fde8a0ebeeb483e3fe5479dcba15474f7e0171121f00e3bae\": container with ID starting with 3b418500b08ce32fde8a0ebeeb483e3fe5479dcba15474f7e0171121f00e3bae not found: ID does not exist" Apr 24 22:20:09.290666 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:09.290644 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg"] Apr 24 22:20:09.294123 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:09.294105 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-fdbwg"] Apr 24 22:20:10.067243 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:10.067213 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e81be839-a938-4663-8282-3aa3ba5bf3df" path="/var/lib/kubelet/pods/e81be839-a938-4663-8282-3aa3ba5bf3df/volumes" Apr 24 22:20:10.269789 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:10.269756 2570 generic.go:358] "Generic (PLEG): container finished" podID="52da8257-9166-41fc-976f-ac8e938fc217" containerID="856b4a5cdb746a158fe55bf70662dbbc5764b9208234702f0daebca94fa8af4e" exitCode=0 Apr 24 22:20:10.269972 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:10.269834 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" event={"ID":"52da8257-9166-41fc-976f-ac8e938fc217","Type":"ContainerDied","Data":"856b4a5cdb746a158fe55bf70662dbbc5764b9208234702f0daebca94fa8af4e"} Apr 24 22:20:11.275509 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:11.275469 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" event={"ID":"52da8257-9166-41fc-976f-ac8e938fc217","Type":"ContainerStarted","Data":"46234694e22ec0ee8bf575a0f7175f274b76ce2288965a7e14f832d83b1a698e"} Apr 24 22:20:11.275885 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:11.275521 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" event={"ID":"52da8257-9166-41fc-976f-ac8e938fc217","Type":"ContainerStarted","Data":"24ff5461415217b0b44f82bf7d491dbb7edc155a9ab60695d63299366afcb114"} Apr 24 22:20:11.275885 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:11.275732 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" Apr 24 22:20:11.296050 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:11.295977 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" 
podStartSLOduration=7.295957812 podStartE2EDuration="7.295957812s" podCreationTimestamp="2026-04-24 22:20:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:20:11.294889386 +0000 UTC m=+3163.718396794" watchObservedRunningTime="2026-04-24 22:20:11.295957812 +0000 UTC m=+3163.719465220" Apr 24 22:20:12.279266 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:12.279232 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" Apr 24 22:20:18.287964 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:18.287932 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" Apr 24 22:20:48.291685 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:48.291654 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" Apr 24 22:20:55.072332 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.072288 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr"] Apr 24 22:20:55.072764 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.072623 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" podUID="52da8257-9166-41fc-976f-ac8e938fc217" containerName="kserve-container" containerID="cri-o://24ff5461415217b0b44f82bf7d491dbb7edc155a9ab60695d63299366afcb114" gracePeriod=30 Apr 24 22:20:55.072836 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.072681 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" podUID="52da8257-9166-41fc-976f-ac8e938fc217" containerName="kube-rbac-proxy" containerID="cri-o://46234694e22ec0ee8bf575a0f7175f274b76ce2288965a7e14f832d83b1a698e" gracePeriod=30 Apr 24 22:20:55.161720 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.161687 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr"] Apr 24 22:20:55.162075 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.162056 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e81be839-a938-4663-8282-3aa3ba5bf3df" containerName="kserve-container" Apr 24 22:20:55.162075 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.162074 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81be839-a938-4663-8282-3aa3ba5bf3df" containerName="kserve-container" Apr 24 22:20:55.162176 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.162098 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e81be839-a938-4663-8282-3aa3ba5bf3df" containerName="storage-initializer" Apr 24 22:20:55.162176 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.162105 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81be839-a938-4663-8282-3aa3ba5bf3df" containerName="storage-initializer" Apr 24 22:20:55.162176 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.162122 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e81be839-a938-4663-8282-3aa3ba5bf3df" containerName="kube-rbac-proxy" Apr 24 22:20:55.162176 ip-10-0-129-230 kubenswrapper[2570]: I0424 
22:20:55.162131 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81be839-a938-4663-8282-3aa3ba5bf3df" containerName="kube-rbac-proxy" Apr 24 22:20:55.162301 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.162186 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="e81be839-a938-4663-8282-3aa3ba5bf3df" containerName="kube-rbac-proxy" Apr 24 22:20:55.162301 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.162199 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="e81be839-a938-4663-8282-3aa3ba5bf3df" containerName="kserve-container" Apr 24 22:20:55.166835 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.166818 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" Apr 24 22:20:55.169148 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.169118 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 24 22:20:55.169280 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.169195 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-predictor-serving-cert\"" Apr 24 22:20:55.173357 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.173330 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr"] Apr 24 22:20:55.288079 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.288005 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-w6fbr\" (UID: \"8230f143-d40e-4620-a6a3-5d8ce02e5fdf\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" Apr 24 22:20:55.288235 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.288141 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfj7p\" (UniqueName: \"kubernetes.io/projected/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-kube-api-access-sfj7p\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-w6fbr\" (UID: \"8230f143-d40e-4620-a6a3-5d8ce02e5fdf\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" Apr 24 22:20:55.288235 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.288189 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-w6fbr\" (UID: \"8230f143-d40e-4620-a6a3-5d8ce02e5fdf\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" Apr 24 22:20:55.288235 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.288224 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-w6fbr\" (UID: \"8230f143-d40e-4620-a6a3-5d8ce02e5fdf\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" Apr 24 22:20:55.388879 ip-10-0-129-230 kubenswrapper[2570]: I0424 
22:20:55.388805 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-w6fbr\" (UID: \"8230f143-d40e-4620-a6a3-5d8ce02e5fdf\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" Apr 24 22:20:55.388879 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.388861 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-w6fbr\" (UID: \"8230f143-d40e-4620-a6a3-5d8ce02e5fdf\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" Apr 24 22:20:55.389108 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.389015 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfj7p\" (UniqueName: \"kubernetes.io/projected/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-kube-api-access-sfj7p\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-w6fbr\" (UID: \"8230f143-d40e-4620-a6a3-5d8ce02e5fdf\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" Apr 24 22:20:55.389108 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.389090 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-w6fbr\" (UID: \"8230f143-d40e-4620-a6a3-5d8ce02e5fdf\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" Apr 24 22:20:55.389394 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.389368 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-w6fbr\" (UID: \"8230f143-d40e-4620-a6a3-5d8ce02e5fdf\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" Apr 24 22:20:55.389734 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.389710 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-w6fbr\" (UID: \"8230f143-d40e-4620-a6a3-5d8ce02e5fdf\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" Apr 24 22:20:55.391637 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.391606 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-w6fbr\" (UID: \"8230f143-d40e-4620-a6a3-5d8ce02e5fdf\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" Apr 24 22:20:55.397633 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.397607 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfj7p\" (UniqueName: \"kubernetes.io/projected/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-kube-api-access-sfj7p\") pod 
\"xgboost-v2-mlserver-predictor-7799869d6f-w6fbr\" (UID: \"8230f143-d40e-4620-a6a3-5d8ce02e5fdf\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" Apr 24 22:20:55.406725 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.406695 2570 generic.go:358] "Generic (PLEG): container finished" podID="52da8257-9166-41fc-976f-ac8e938fc217" containerID="46234694e22ec0ee8bf575a0f7175f274b76ce2288965a7e14f832d83b1a698e" exitCode=2 Apr 24 22:20:55.406895 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.406771 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" event={"ID":"52da8257-9166-41fc-976f-ac8e938fc217","Type":"ContainerDied","Data":"46234694e22ec0ee8bf575a0f7175f274b76ce2288965a7e14f832d83b1a698e"} Apr 24 22:20:55.477774 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.477739 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" Apr 24 22:20:55.603580 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:55.603549 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr"] Apr 24 22:20:55.606578 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:20:55.606551 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8230f143_d40e_4620_a6a3_5d8ce02e5fdf.slice/crio-fc7f57b5d56e0ff555716a31617c33a32051945fba92ceaf3d513ed10b0a5418 WatchSource:0}: Error finding container fc7f57b5d56e0ff555716a31617c33a32051945fba92ceaf3d513ed10b0a5418: Status 404 returned error can't find the container with id fc7f57b5d56e0ff555716a31617c33a32051945fba92ceaf3d513ed10b0a5418 Apr 24 22:20:56.411166 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:56.411121 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" event={"ID":"8230f143-d40e-4620-a6a3-5d8ce02e5fdf","Type":"ContainerStarted","Data":"0165bce9e3faf2e26c01bc04391a5e679eeacf9e1ddfc8f54d82822215530091"} Apr 24 22:20:56.411166 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:56.411169 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" event={"ID":"8230f143-d40e-4620-a6a3-5d8ce02e5fdf","Type":"ContainerStarted","Data":"fc7f57b5d56e0ff555716a31617c33a32051945fba92ceaf3d513ed10b0a5418"} Apr 24 22:20:58.283001 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:58.282963 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" podUID="52da8257-9166-41fc-976f-ac8e938fc217" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.56:8643/healthz\": dial tcp 10.133.0.56:8643: connect: connection refused" Apr 24 22:20:58.289331 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:58.289303 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" podUID="52da8257-9166-41fc-976f-ac8e938fc217" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.56:8080/v2/models/isvc-xgboost-v2-mlserver/ready\": dial tcp 10.133.0.56:8080: connect: connection refused" Apr 24 22:20:59.421011 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:59.420980 2570 generic.go:358] "Generic (PLEG): container finished" 
podID="8230f143-d40e-4620-a6a3-5d8ce02e5fdf" containerID="0165bce9e3faf2e26c01bc04391a5e679eeacf9e1ddfc8f54d82822215530091" exitCode=0 Apr 24 22:20:59.421011 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:20:59.421057 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" event={"ID":"8230f143-d40e-4620-a6a3-5d8ce02e5fdf","Type":"ContainerDied","Data":"0165bce9e3faf2e26c01bc04391a5e679eeacf9e1ddfc8f54d82822215530091"} Apr 24 22:21:00.426807 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:00.426769 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" event={"ID":"8230f143-d40e-4620-a6a3-5d8ce02e5fdf","Type":"ContainerStarted","Data":"4bc33cf374d9eea8728ad43021d51acc62497fd1c14a8376b8b829ea828f8ade"} Apr 24 22:21:00.426807 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:00.426813 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" event={"ID":"8230f143-d40e-4620-a6a3-5d8ce02e5fdf","Type":"ContainerStarted","Data":"4f16cd54bf075d2cdd90ce028dbeb9887bcfad835e2b405137f480a101dcd830"} Apr 24 22:21:00.427256 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:00.427053 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" Apr 24 22:21:00.427256 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:00.427118 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" Apr 24 22:21:00.446977 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:00.446927 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" podStartSLOduration=5.446913796 podStartE2EDuration="5.446913796s" podCreationTimestamp="2026-04-24 22:20:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:21:00.445146434 +0000 UTC m=+3212.868653841" watchObservedRunningTime="2026-04-24 22:21:00.446913796 +0000 UTC m=+3212.870421200" Apr 24 22:21:01.914819 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:01.914797 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" Apr 24 22:21:01.941955 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:01.941927 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/52da8257-9166-41fc-976f-ac8e938fc217-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"52da8257-9166-41fc-976f-ac8e938fc217\" (UID: \"52da8257-9166-41fc-976f-ac8e938fc217\") " Apr 24 22:21:01.942155 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:01.941978 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/52da8257-9166-41fc-976f-ac8e938fc217-proxy-tls\") pod \"52da8257-9166-41fc-976f-ac8e938fc217\" (UID: \"52da8257-9166-41fc-976f-ac8e938fc217\") " Apr 24 22:21:01.942155 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:01.942012 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbgxw\" (UniqueName: \"kubernetes.io/projected/52da8257-9166-41fc-976f-ac8e938fc217-kube-api-access-jbgxw\") pod \"52da8257-9166-41fc-976f-ac8e938fc217\" (UID: \"52da8257-9166-41fc-976f-ac8e938fc217\") " Apr 24 22:21:01.942155 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:01.942073 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/52da8257-9166-41fc-976f-ac8e938fc217-kserve-provision-location\") pod \"52da8257-9166-41fc-976f-ac8e938fc217\" (UID: \"52da8257-9166-41fc-976f-ac8e938fc217\") " Apr 24 22:21:01.942384 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:01.942332 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52da8257-9166-41fc-976f-ac8e938fc217-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "52da8257-9166-41fc-976f-ac8e938fc217" (UID: "52da8257-9166-41fc-976f-ac8e938fc217"). InnerVolumeSpecName "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:21:01.942445 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:01.942417 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52da8257-9166-41fc-976f-ac8e938fc217-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "52da8257-9166-41fc-976f-ac8e938fc217" (UID: "52da8257-9166-41fc-976f-ac8e938fc217"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:21:01.944376 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:01.944342 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52da8257-9166-41fc-976f-ac8e938fc217-kube-api-access-jbgxw" (OuterVolumeSpecName: "kube-api-access-jbgxw") pod "52da8257-9166-41fc-976f-ac8e938fc217" (UID: "52da8257-9166-41fc-976f-ac8e938fc217"). InnerVolumeSpecName "kube-api-access-jbgxw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:21:01.944486 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:01.944396 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52da8257-9166-41fc-976f-ac8e938fc217-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "52da8257-9166-41fc-976f-ac8e938fc217" (UID: "52da8257-9166-41fc-976f-ac8e938fc217"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:21:02.042608 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:02.042528 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/52da8257-9166-41fc-976f-ac8e938fc217-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:21:02.042608 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:02.042555 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/52da8257-9166-41fc-976f-ac8e938fc217-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:21:02.042608 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:02.042567 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/52da8257-9166-41fc-976f-ac8e938fc217-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:21:02.042608 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:02.042576 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jbgxw\" (UniqueName: \"kubernetes.io/projected/52da8257-9166-41fc-976f-ac8e938fc217-kube-api-access-jbgxw\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:21:02.433951 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:02.433915 2570 generic.go:358] "Generic (PLEG): container finished" podID="52da8257-9166-41fc-976f-ac8e938fc217" containerID="24ff5461415217b0b44f82bf7d491dbb7edc155a9ab60695d63299366afcb114" exitCode=0 Apr 24 22:21:02.434138 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:02.433990 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" Apr 24 22:21:02.434138 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:02.433995 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" event={"ID":"52da8257-9166-41fc-976f-ac8e938fc217","Type":"ContainerDied","Data":"24ff5461415217b0b44f82bf7d491dbb7edc155a9ab60695d63299366afcb114"} Apr 24 22:21:02.434138 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:02.434054 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr" event={"ID":"52da8257-9166-41fc-976f-ac8e938fc217","Type":"ContainerDied","Data":"2522a11f2d376f2d6f1663f89fb8150840f9cd2b8a5a918faf57f780f5a1fc9b"} Apr 24 22:21:02.434138 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:02.434072 2570 scope.go:117] "RemoveContainer" containerID="46234694e22ec0ee8bf575a0f7175f274b76ce2288965a7e14f832d83b1a698e" Apr 24 22:21:02.441959 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:02.441942 2570 scope.go:117] "RemoveContainer" containerID="24ff5461415217b0b44f82bf7d491dbb7edc155a9ab60695d63299366afcb114" Apr 24 22:21:02.449798 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:02.449778 2570 scope.go:117] "RemoveContainer" containerID="856b4a5cdb746a158fe55bf70662dbbc5764b9208234702f0daebca94fa8af4e" Apr 24 22:21:02.452259 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:02.452235 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr"] Apr 24 22:21:02.457999 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:02.457974 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-bdzgr"] Apr 24 22:21:02.458278 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:02.458254 2570 scope.go:117] "RemoveContainer" containerID="46234694e22ec0ee8bf575a0f7175f274b76ce2288965a7e14f832d83b1a698e" Apr 24 22:21:02.458533 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:21:02.458515 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46234694e22ec0ee8bf575a0f7175f274b76ce2288965a7e14f832d83b1a698e\": container with ID starting with 46234694e22ec0ee8bf575a0f7175f274b76ce2288965a7e14f832d83b1a698e not found: ID does not exist" containerID="46234694e22ec0ee8bf575a0f7175f274b76ce2288965a7e14f832d83b1a698e" Apr 24 22:21:02.458607 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:02.458546 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46234694e22ec0ee8bf575a0f7175f274b76ce2288965a7e14f832d83b1a698e"} err="failed to get container status \"46234694e22ec0ee8bf575a0f7175f274b76ce2288965a7e14f832d83b1a698e\": rpc error: code = NotFound desc = could not find container \"46234694e22ec0ee8bf575a0f7175f274b76ce2288965a7e14f832d83b1a698e\": container with ID starting with 46234694e22ec0ee8bf575a0f7175f274b76ce2288965a7e14f832d83b1a698e not found: ID does not exist" Apr 24 22:21:02.458607 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:02.458571 2570 scope.go:117] "RemoveContainer" containerID="24ff5461415217b0b44f82bf7d491dbb7edc155a9ab60695d63299366afcb114" Apr 24 22:21:02.458829 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:21:02.458814 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"24ff5461415217b0b44f82bf7d491dbb7edc155a9ab60695d63299366afcb114\": container with ID starting with 24ff5461415217b0b44f82bf7d491dbb7edc155a9ab60695d63299366afcb114 not found: ID does not exist" containerID="24ff5461415217b0b44f82bf7d491dbb7edc155a9ab60695d63299366afcb114" Apr 24 22:21:02.458891 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:02.458836 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24ff5461415217b0b44f82bf7d491dbb7edc155a9ab60695d63299366afcb114"} err="failed to get container status \"24ff5461415217b0b44f82bf7d491dbb7edc155a9ab60695d63299366afcb114\": rpc error: code = NotFound desc = could not find container \"24ff5461415217b0b44f82bf7d491dbb7edc155a9ab60695d63299366afcb114\": container with ID starting with 24ff5461415217b0b44f82bf7d491dbb7edc155a9ab60695d63299366afcb114 not found: ID does not exist" Apr 24 22:21:02.458891 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:02.458856 2570 scope.go:117] "RemoveContainer" containerID="856b4a5cdb746a158fe55bf70662dbbc5764b9208234702f0daebca94fa8af4e" Apr 24 22:21:02.459155 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:21:02.459137 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"856b4a5cdb746a158fe55bf70662dbbc5764b9208234702f0daebca94fa8af4e\": container with ID starting with 856b4a5cdb746a158fe55bf70662dbbc5764b9208234702f0daebca94fa8af4e not found: ID does not exist" containerID="856b4a5cdb746a158fe55bf70662dbbc5764b9208234702f0daebca94fa8af4e" Apr 24 22:21:02.459208 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:02.459161 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"856b4a5cdb746a158fe55bf70662dbbc5764b9208234702f0daebca94fa8af4e"} err="failed to get container status \"856b4a5cdb746a158fe55bf70662dbbc5764b9208234702f0daebca94fa8af4e\": rpc error: code = NotFound desc = could not find container \"856b4a5cdb746a158fe55bf70662dbbc5764b9208234702f0daebca94fa8af4e\": container with ID starting with 856b4a5cdb746a158fe55bf70662dbbc5764b9208234702f0daebca94fa8af4e not found: ID does not exist" Apr 24 22:21:04.066738 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:04.066708 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52da8257-9166-41fc-976f-ac8e938fc217" path="/var/lib/kubelet/pods/52da8257-9166-41fc-976f-ac8e938fc217/volumes" Apr 24 22:21:06.434566 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:06.434539 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" Apr 24 22:21:36.438673 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:36.438645 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" Apr 24 22:21:45.279605 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.279564 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr"] Apr 24 22:21:45.280284 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.279993 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" podUID="8230f143-d40e-4620-a6a3-5d8ce02e5fdf" containerName="kserve-container" containerID="cri-o://4f16cd54bf075d2cdd90ce028dbeb9887bcfad835e2b405137f480a101dcd830" gracePeriod=30 Apr 24 22:21:45.280284 
ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.280061 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" podUID="8230f143-d40e-4620-a6a3-5d8ce02e5fdf" containerName="kube-rbac-proxy" containerID="cri-o://4bc33cf374d9eea8728ad43021d51acc62497fd1c14a8376b8b829ea828f8ade" gracePeriod=30 Apr 24 22:21:45.376983 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.376950 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf"] Apr 24 22:21:45.377296 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.377284 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52da8257-9166-41fc-976f-ac8e938fc217" containerName="kube-rbac-proxy" Apr 24 22:21:45.377345 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.377298 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="52da8257-9166-41fc-976f-ac8e938fc217" containerName="kube-rbac-proxy" Apr 24 22:21:45.377345 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.377316 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52da8257-9166-41fc-976f-ac8e938fc217" containerName="storage-initializer" Apr 24 22:21:45.377345 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.377322 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="52da8257-9166-41fc-976f-ac8e938fc217" containerName="storage-initializer" Apr 24 22:21:45.377345 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.377333 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52da8257-9166-41fc-976f-ac8e938fc217" containerName="kserve-container" Apr 24 22:21:45.377345 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.377339 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="52da8257-9166-41fc-976f-ac8e938fc217" containerName="kserve-container" Apr 24 22:21:45.377501 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.377380 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="52da8257-9166-41fc-976f-ac8e938fc217" containerName="kube-rbac-proxy" Apr 24 22:21:45.377501 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.377391 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="52da8257-9166-41fc-976f-ac8e938fc217" containerName="kserve-container" Apr 24 22:21:45.381741 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.381724 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" Apr 24 22:21:45.384258 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.384231 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-predictor-serving-cert\"" Apr 24 22:21:45.384424 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.384238 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\"" Apr 24 22:21:45.393050 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.392998 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf"] Apr 24 22:21:45.470708 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.470668 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-h7rxf\" (UID: \"66a06231-6cc9-4a1a-a9bb-c1a236fd0485\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" Apr 24 22:21:45.470914 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.470752 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-h7rxf\" (UID: \"66a06231-6cc9-4a1a-a9bb-c1a236fd0485\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" Apr 24 22:21:45.470914 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.470778 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-h7rxf\" (UID: \"66a06231-6cc9-4a1a-a9bb-c1a236fd0485\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" Apr 24 22:21:45.470914 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.470804 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l4fr\" (UniqueName: \"kubernetes.io/projected/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-kube-api-access-5l4fr\") pod \"isvc-xgboost-runtime-predictor-779db84d9-h7rxf\" (UID: \"66a06231-6cc9-4a1a-a9bb-c1a236fd0485\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" Apr 24 22:21:45.558397 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.558302 2570 generic.go:358] "Generic (PLEG): container finished" podID="8230f143-d40e-4620-a6a3-5d8ce02e5fdf" containerID="4bc33cf374d9eea8728ad43021d51acc62497fd1c14a8376b8b829ea828f8ade" exitCode=2 Apr 24 22:21:45.558559 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.558386 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" event={"ID":"8230f143-d40e-4620-a6a3-5d8ce02e5fdf","Type":"ContainerDied","Data":"4bc33cf374d9eea8728ad43021d51acc62497fd1c14a8376b8b829ea828f8ade"} Apr 24 22:21:45.571728 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.571695 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-h7rxf\" (UID: \"66a06231-6cc9-4a1a-a9bb-c1a236fd0485\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" Apr 24 22:21:45.571808 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.571738 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-h7rxf\" (UID: \"66a06231-6cc9-4a1a-a9bb-c1a236fd0485\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" Apr 24 22:21:45.571808 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.571771 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5l4fr\" (UniqueName: \"kubernetes.io/projected/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-kube-api-access-5l4fr\") pod \"isvc-xgboost-runtime-predictor-779db84d9-h7rxf\" (UID: \"66a06231-6cc9-4a1a-a9bb-c1a236fd0485\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" Apr 24 22:21:45.571808 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.571796 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-h7rxf\" (UID: \"66a06231-6cc9-4a1a-a9bb-c1a236fd0485\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" Apr 24 22:21:45.571938 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:21:45.571839 2570 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-serving-cert: secret "isvc-xgboost-runtime-predictor-serving-cert" not found Apr 24 22:21:45.571938 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:21:45.571907 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-proxy-tls podName:66a06231-6cc9-4a1a-a9bb-c1a236fd0485 nodeName:}" failed. No retries permitted until 2026-04-24 22:21:46.071889899 +0000 UTC m=+3258.495397284 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-proxy-tls") pod "isvc-xgboost-runtime-predictor-779db84d9-h7rxf" (UID: "66a06231-6cc9-4a1a-a9bb-c1a236fd0485") : secret "isvc-xgboost-runtime-predictor-serving-cert" not found Apr 24 22:21:45.572275 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.572255 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-h7rxf\" (UID: \"66a06231-6cc9-4a1a-a9bb-c1a236fd0485\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" Apr 24 22:21:45.572567 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.572545 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-h7rxf\" (UID: \"66a06231-6cc9-4a1a-a9bb-c1a236fd0485\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" Apr 24 22:21:45.580907 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:45.580886 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l4fr\" (UniqueName: \"kubernetes.io/projected/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-kube-api-access-5l4fr\") pod \"isvc-xgboost-runtime-predictor-779db84d9-h7rxf\" (UID: \"66a06231-6cc9-4a1a-a9bb-c1a236fd0485\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" Apr 24 22:21:46.076795 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:46.076758 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-h7rxf\" (UID: \"66a06231-6cc9-4a1a-a9bb-c1a236fd0485\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" Apr 24 22:21:46.079412 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:46.079379 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-h7rxf\" (UID: \"66a06231-6cc9-4a1a-a9bb-c1a236fd0485\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" Apr 24 22:21:46.292368 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:46.292329 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" Apr 24 22:21:46.429817 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:46.429778 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf"] Apr 24 22:21:46.430085 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:46.430057 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" podUID="8230f143-d40e-4620-a6a3-5d8ce02e5fdf" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.57:8643/healthz\": dial tcp 10.133.0.57:8643: connect: connection refused" Apr 24 22:21:46.433073 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:21:46.433041 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66a06231_6cc9_4a1a_a9bb_c1a236fd0485.slice/crio-7ea46bb79c62fd781ccb41ed5f7dcda6e41b0778a2cee28c446e58b01a77683a WatchSource:0}: Error finding container 7ea46bb79c62fd781ccb41ed5f7dcda6e41b0778a2cee28c446e58b01a77683a: Status 404 returned error can't find the container with id 7ea46bb79c62fd781ccb41ed5f7dcda6e41b0778a2cee28c446e58b01a77683a Apr 24 22:21:46.562595 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:46.562552 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" event={"ID":"66a06231-6cc9-4a1a-a9bb-c1a236fd0485","Type":"ContainerStarted","Data":"94d5a3624a3a8785a6b2de803fc7835bd964942807104950164d0d584535a28a"} Apr 24 22:21:46.562595 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:46.562603 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" event={"ID":"66a06231-6cc9-4a1a-a9bb-c1a236fd0485","Type":"ContainerStarted","Data":"7ea46bb79c62fd781ccb41ed5f7dcda6e41b0778a2cee28c446e58b01a77683a"} Apr 24 22:21:47.476258 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:47.476209 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" podUID="8230f143-d40e-4620-a6a3-5d8ce02e5fdf" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.57:8080/v2/models/xgboost-v2-mlserver/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 24 22:21:50.577082 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:50.577014 2570 generic.go:358] "Generic (PLEG): container finished" podID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" containerID="94d5a3624a3a8785a6b2de803fc7835bd964942807104950164d0d584535a28a" exitCode=0 Apr 24 22:21:50.577585 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:50.577096 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" event={"ID":"66a06231-6cc9-4a1a-a9bb-c1a236fd0485","Type":"ContainerDied","Data":"94d5a3624a3a8785a6b2de803fc7835bd964942807104950164d0d584535a28a"} Apr 24 22:21:51.430463 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:51.430413 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" podUID="8230f143-d40e-4620-a6a3-5d8ce02e5fdf" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.57:8643/healthz\": dial tcp 10.133.0.57:8643: connect: connection refused" Apr 24 22:21:51.581832 
ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:51.581796 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" event={"ID":"66a06231-6cc9-4a1a-a9bb-c1a236fd0485","Type":"ContainerStarted","Data":"060bbcf9a1db89e615a3bc6a30ceee677e4d9f28939c0caaf342337c9d0e4708"} Apr 24 22:21:51.581832 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:51.581835 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" event={"ID":"66a06231-6cc9-4a1a-a9bb-c1a236fd0485","Type":"ContainerStarted","Data":"7a4a3b14a4627c330559fcd2d18b00a4dbac2f027cf5cfca1c37d82e7e148d7d"} Apr 24 22:21:51.582322 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:51.582124 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" Apr 24 22:21:51.582322 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:51.582267 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" Apr 24 22:21:51.583464 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:51.583441 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" podUID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 24 22:21:51.603119 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:51.603073 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" podStartSLOduration=6.603060963 podStartE2EDuration="6.603060963s" podCreationTimestamp="2026-04-24 22:21:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:21:51.602914764 +0000 UTC m=+3264.026422174" watchObservedRunningTime="2026-04-24 22:21:51.603060963 +0000 UTC m=+3264.026568370" Apr 24 22:21:52.021291 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.021269 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" Apr 24 22:21:52.124446 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.124358 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfj7p\" (UniqueName: \"kubernetes.io/projected/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-kube-api-access-sfj7p\") pod \"8230f143-d40e-4620-a6a3-5d8ce02e5fdf\" (UID: \"8230f143-d40e-4620-a6a3-5d8ce02e5fdf\") " Apr 24 22:21:52.124446 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.124447 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-kserve-provision-location\") pod \"8230f143-d40e-4620-a6a3-5d8ce02e5fdf\" (UID: \"8230f143-d40e-4620-a6a3-5d8ce02e5fdf\") " Apr 24 22:21:52.124683 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.124485 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"8230f143-d40e-4620-a6a3-5d8ce02e5fdf\" (UID: \"8230f143-d40e-4620-a6a3-5d8ce02e5fdf\") " Apr 24 22:21:52.124683 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.124520 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-proxy-tls\") pod \"8230f143-d40e-4620-a6a3-5d8ce02e5fdf\" (UID: \"8230f143-d40e-4620-a6a3-5d8ce02e5fdf\") " Apr 24 22:21:52.124929 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.124900 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "8230f143-d40e-4620-a6a3-5d8ce02e5fdf" (UID: "8230f143-d40e-4620-a6a3-5d8ce02e5fdf"). InnerVolumeSpecName "xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:21:52.125005 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.124978 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8230f143-d40e-4620-a6a3-5d8ce02e5fdf" (UID: "8230f143-d40e-4620-a6a3-5d8ce02e5fdf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:21:52.126798 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.126760 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-kube-api-access-sfj7p" (OuterVolumeSpecName: "kube-api-access-sfj7p") pod "8230f143-d40e-4620-a6a3-5d8ce02e5fdf" (UID: "8230f143-d40e-4620-a6a3-5d8ce02e5fdf"). InnerVolumeSpecName "kube-api-access-sfj7p". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:21:52.126911 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.126866 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8230f143-d40e-4620-a6a3-5d8ce02e5fdf" (UID: "8230f143-d40e-4620-a6a3-5d8ce02e5fdf"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:21:52.224987 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.224947 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:21:52.224987 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.224977 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sfj7p\" (UniqueName: \"kubernetes.io/projected/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-kube-api-access-sfj7p\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:21:52.224987 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.224988 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:21:52.224987 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.224998 2570 reconciler_common.go:299] "Volume detached for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8230f143-d40e-4620-a6a3-5d8ce02e5fdf-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:21:52.586445 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.586411 2570 generic.go:358] "Generic (PLEG): container finished" podID="8230f143-d40e-4620-a6a3-5d8ce02e5fdf" containerID="4f16cd54bf075d2cdd90ce028dbeb9887bcfad835e2b405137f480a101dcd830" exitCode=0 Apr 24 22:21:52.586870 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.586492 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" Apr 24 22:21:52.586870 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.586498 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" event={"ID":"8230f143-d40e-4620-a6a3-5d8ce02e5fdf","Type":"ContainerDied","Data":"4f16cd54bf075d2cdd90ce028dbeb9887bcfad835e2b405137f480a101dcd830"} Apr 24 22:21:52.586870 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.586535 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr" event={"ID":"8230f143-d40e-4620-a6a3-5d8ce02e5fdf","Type":"ContainerDied","Data":"fc7f57b5d56e0ff555716a31617c33a32051945fba92ceaf3d513ed10b0a5418"} Apr 24 22:21:52.586870 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.586551 2570 scope.go:117] "RemoveContainer" containerID="4bc33cf374d9eea8728ad43021d51acc62497fd1c14a8376b8b829ea828f8ade" Apr 24 22:21:52.587244 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.587212 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" podUID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 24 22:21:52.595267 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.595250 2570 scope.go:117] "RemoveContainer" containerID="4f16cd54bf075d2cdd90ce028dbeb9887bcfad835e2b405137f480a101dcd830" Apr 24 22:21:52.602686 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.602663 2570 scope.go:117] "RemoveContainer" containerID="0165bce9e3faf2e26c01bc04391a5e679eeacf9e1ddfc8f54d82822215530091" Apr 24 22:21:52.607274 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.607247 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr"] Apr 24 22:21:52.610094 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.610074 2570 scope.go:117] "RemoveContainer" containerID="4bc33cf374d9eea8728ad43021d51acc62497fd1c14a8376b8b829ea828f8ade" Apr 24 22:21:52.610341 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:21:52.610325 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bc33cf374d9eea8728ad43021d51acc62497fd1c14a8376b8b829ea828f8ade\": container with ID starting with 4bc33cf374d9eea8728ad43021d51acc62497fd1c14a8376b8b829ea828f8ade not found: ID does not exist" containerID="4bc33cf374d9eea8728ad43021d51acc62497fd1c14a8376b8b829ea828f8ade" Apr 24 22:21:52.610397 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.610349 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bc33cf374d9eea8728ad43021d51acc62497fd1c14a8376b8b829ea828f8ade"} err="failed to get container status \"4bc33cf374d9eea8728ad43021d51acc62497fd1c14a8376b8b829ea828f8ade\": rpc error: code = NotFound desc = could not find container \"4bc33cf374d9eea8728ad43021d51acc62497fd1c14a8376b8b829ea828f8ade\": container with ID starting with 4bc33cf374d9eea8728ad43021d51acc62497fd1c14a8376b8b829ea828f8ade not found: ID does not exist" Apr 24 22:21:52.610397 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.610365 2570 scope.go:117] "RemoveContainer" containerID="4f16cd54bf075d2cdd90ce028dbeb9887bcfad835e2b405137f480a101dcd830" Apr 24 22:21:52.610625 ip-10-0-129-230 kubenswrapper[2570]: E0424 
22:21:52.610604 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f16cd54bf075d2cdd90ce028dbeb9887bcfad835e2b405137f480a101dcd830\": container with ID starting with 4f16cd54bf075d2cdd90ce028dbeb9887bcfad835e2b405137f480a101dcd830 not found: ID does not exist" containerID="4f16cd54bf075d2cdd90ce028dbeb9887bcfad835e2b405137f480a101dcd830" Apr 24 22:21:52.610719 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.610630 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f16cd54bf075d2cdd90ce028dbeb9887bcfad835e2b405137f480a101dcd830"} err="failed to get container status \"4f16cd54bf075d2cdd90ce028dbeb9887bcfad835e2b405137f480a101dcd830\": rpc error: code = NotFound desc = could not find container \"4f16cd54bf075d2cdd90ce028dbeb9887bcfad835e2b405137f480a101dcd830\": container with ID starting with 4f16cd54bf075d2cdd90ce028dbeb9887bcfad835e2b405137f480a101dcd830 not found: ID does not exist" Apr 24 22:21:52.610719 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.610647 2570 scope.go:117] "RemoveContainer" containerID="0165bce9e3faf2e26c01bc04391a5e679eeacf9e1ddfc8f54d82822215530091" Apr 24 22:21:52.610936 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:21:52.610899 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0165bce9e3faf2e26c01bc04391a5e679eeacf9e1ddfc8f54d82822215530091\": container with ID starting with 0165bce9e3faf2e26c01bc04391a5e679eeacf9e1ddfc8f54d82822215530091 not found: ID does not exist" containerID="0165bce9e3faf2e26c01bc04391a5e679eeacf9e1ddfc8f54d82822215530091" Apr 24 22:21:52.611054 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.610940 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0165bce9e3faf2e26c01bc04391a5e679eeacf9e1ddfc8f54d82822215530091"} err="failed to get container status \"0165bce9e3faf2e26c01bc04391a5e679eeacf9e1ddfc8f54d82822215530091\": rpc error: code = NotFound desc = could not find container \"0165bce9e3faf2e26c01bc04391a5e679eeacf9e1ddfc8f54d82822215530091\": container with ID starting with 0165bce9e3faf2e26c01bc04391a5e679eeacf9e1ddfc8f54d82822215530091 not found: ID does not exist" Apr 24 22:21:52.612652 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:52.612631 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w6fbr"] Apr 24 22:21:54.066862 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:54.066832 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8230f143-d40e-4620-a6a3-5d8ce02e5fdf" path="/var/lib/kubelet/pods/8230f143-d40e-4620-a6a3-5d8ce02e5fdf/volumes" Apr 24 22:21:57.591816 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:57.591786 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" Apr 24 22:21:57.592443 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:21:57.592409 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" podUID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 24 22:22:07.592597 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:22:07.592553 2570 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" podUID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 24 22:22:17.592490 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:22:17.592444 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" podUID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 24 22:22:27.592846 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:22:27.592807 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" podUID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 24 22:22:37.593228 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:22:37.593178 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" podUID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 24 22:22:47.592475 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:22:47.592429 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" podUID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 24 22:22:57.593203 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:22:57.593163 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" Apr 24 22:23:05.408649 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.408616 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf"] Apr 24 22:23:05.409098 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.408933 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" podUID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" containerName="kserve-container" containerID="cri-o://7a4a3b14a4627c330559fcd2d18b00a4dbac2f027cf5cfca1c37d82e7e148d7d" gracePeriod=30 Apr 24 22:23:05.409098 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.408987 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" podUID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" containerName="kube-rbac-proxy" containerID="cri-o://060bbcf9a1db89e615a3bc6a30ceee677e4d9f28939c0caaf342337c9d0e4708" gracePeriod=30 Apr 24 22:23:05.479964 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.479931 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t"] Apr 24 22:23:05.480260 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.480247 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8230f143-d40e-4620-a6a3-5d8ce02e5fdf" containerName="kube-rbac-proxy" Apr 24 22:23:05.480321 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.480263 2570 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="8230f143-d40e-4620-a6a3-5d8ce02e5fdf" containerName="kube-rbac-proxy" Apr 24 22:23:05.480321 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.480281 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8230f143-d40e-4620-a6a3-5d8ce02e5fdf" containerName="kserve-container" Apr 24 22:23:05.480321 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.480287 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8230f143-d40e-4620-a6a3-5d8ce02e5fdf" containerName="kserve-container" Apr 24 22:23:05.480321 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.480296 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8230f143-d40e-4620-a6a3-5d8ce02e5fdf" containerName="storage-initializer" Apr 24 22:23:05.480321 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.480302 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8230f143-d40e-4620-a6a3-5d8ce02e5fdf" containerName="storage-initializer" Apr 24 22:23:05.480478 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.480346 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8230f143-d40e-4620-a6a3-5d8ce02e5fdf" containerName="kserve-container" Apr 24 22:23:05.480478 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.480355 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8230f143-d40e-4620-a6a3-5d8ce02e5fdf" containerName="kube-rbac-proxy" Apr 24 22:23:05.483261 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.483246 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" Apr 24 22:23:05.485533 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.485510 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-predictor-serving-cert\"" Apr 24 22:23:05.485737 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.485716 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 24 22:23:05.491715 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.491691 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t"] Apr 24 22:23:05.548457 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.548431 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t\" (UID: \"ca63ae87-4568-4b94-8f99-cf09ee0f52b5\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" Apr 24 22:23:05.548592 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.548471 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t\" (UID: \"ca63ae87-4568-4b94-8f99-cf09ee0f52b5\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" Apr 24 22:23:05.548592 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.548494 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtnhp\" (UniqueName: \"kubernetes.io/projected/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-kube-api-access-gtnhp\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t\" (UID: \"ca63ae87-4568-4b94-8f99-cf09ee0f52b5\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" Apr 24 22:23:05.548592 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.548577 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t\" (UID: \"ca63ae87-4568-4b94-8f99-cf09ee0f52b5\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" Apr 24 22:23:05.649543 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.649515 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t\" (UID: \"ca63ae87-4568-4b94-8f99-cf09ee0f52b5\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" Apr 24 22:23:05.649543 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.649549 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtnhp\" (UniqueName: \"kubernetes.io/projected/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-kube-api-access-gtnhp\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t\" (UID: \"ca63ae87-4568-4b94-8f99-cf09ee0f52b5\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" Apr 24 22:23:05.649749 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.649588 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t\" (UID: \"ca63ae87-4568-4b94-8f99-cf09ee0f52b5\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" Apr 24 22:23:05.649749 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.649623 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t\" (UID: \"ca63ae87-4568-4b94-8f99-cf09ee0f52b5\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" Apr 24 22:23:05.650107 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.650074 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t\" (UID: \"ca63ae87-4568-4b94-8f99-cf09ee0f52b5\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" Apr 24 22:23:05.650373 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.650351 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t\" (UID: \"ca63ae87-4568-4b94-8f99-cf09ee0f52b5\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" Apr 24 22:23:05.652176 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.652151 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t\" (UID: \"ca63ae87-4568-4b94-8f99-cf09ee0f52b5\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" Apr 24 22:23:05.657296 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.657267 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtnhp\" (UniqueName: \"kubernetes.io/projected/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-kube-api-access-gtnhp\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t\" (UID: \"ca63ae87-4568-4b94-8f99-cf09ee0f52b5\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" Apr 24 22:23:05.794495 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.794460 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" Apr 24 22:23:05.803523 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.803498 2570 generic.go:358] "Generic (PLEG): container finished" podID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" containerID="060bbcf9a1db89e615a3bc6a30ceee677e4d9f28939c0caaf342337c9d0e4708" exitCode=2 Apr 24 22:23:05.803632 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.803551 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" event={"ID":"66a06231-6cc9-4a1a-a9bb-c1a236fd0485","Type":"ContainerDied","Data":"060bbcf9a1db89e615a3bc6a30ceee677e4d9f28939c0caaf342337c9d0e4708"} Apr 24 22:23:05.917307 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:05.917282 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t"] Apr 24 22:23:05.920321 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:23:05.920291 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca63ae87_4568_4b94_8f99_cf09ee0f52b5.slice/crio-b13aa3a0ab61ca6ed094845dd1432c7ff89f87b167d0c097cac50ee13414fa81 WatchSource:0}: Error finding container b13aa3a0ab61ca6ed094845dd1432c7ff89f87b167d0c097cac50ee13414fa81: Status 404 returned error can't find the container with id b13aa3a0ab61ca6ed094845dd1432c7ff89f87b167d0c097cac50ee13414fa81 Apr 24 22:23:06.807531 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:06.807494 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" event={"ID":"ca63ae87-4568-4b94-8f99-cf09ee0f52b5","Type":"ContainerStarted","Data":"2a376967c2c154d61bb21a667a2a49e97438f22ed74ef60082e8e2824b168ae0"} Apr 24 22:23:06.807531 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:06.807530 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" 
event={"ID":"ca63ae87-4568-4b94-8f99-cf09ee0f52b5","Type":"ContainerStarted","Data":"b13aa3a0ab61ca6ed094845dd1432c7ff89f87b167d0c097cac50ee13414fa81"} Apr 24 22:23:07.588370 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:07.588322 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" podUID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.58:8643/healthz\": dial tcp 10.133.0.58:8643: connect: connection refused" Apr 24 22:23:07.593219 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:07.593183 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" podUID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 24 22:23:09.255301 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.255278 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" Apr 24 22:23:09.380188 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.380095 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-proxy-tls\") pod \"66a06231-6cc9-4a1a-a9bb-c1a236fd0485\" (UID: \"66a06231-6cc9-4a1a-a9bb-c1a236fd0485\") " Apr 24 22:23:09.380188 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.380175 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"66a06231-6cc9-4a1a-a9bb-c1a236fd0485\" (UID: \"66a06231-6cc9-4a1a-a9bb-c1a236fd0485\") " Apr 24 22:23:09.380428 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.380207 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l4fr\" (UniqueName: \"kubernetes.io/projected/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-kube-api-access-5l4fr\") pod \"66a06231-6cc9-4a1a-a9bb-c1a236fd0485\" (UID: \"66a06231-6cc9-4a1a-a9bb-c1a236fd0485\") " Apr 24 22:23:09.380428 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.380284 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-kserve-provision-location\") pod \"66a06231-6cc9-4a1a-a9bb-c1a236fd0485\" (UID: \"66a06231-6cc9-4a1a-a9bb-c1a236fd0485\") " Apr 24 22:23:09.380634 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.380582 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-isvc-xgboost-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-runtime-kube-rbac-proxy-sar-config") pod "66a06231-6cc9-4a1a-a9bb-c1a236fd0485" (UID: "66a06231-6cc9-4a1a-a9bb-c1a236fd0485"). InnerVolumeSpecName "isvc-xgboost-runtime-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:23:09.380938 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.380783 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "66a06231-6cc9-4a1a-a9bb-c1a236fd0485" (UID: "66a06231-6cc9-4a1a-a9bb-c1a236fd0485"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:23:09.382560 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.382538 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "66a06231-6cc9-4a1a-a9bb-c1a236fd0485" (UID: "66a06231-6cc9-4a1a-a9bb-c1a236fd0485"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:23:09.382656 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.382641 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-kube-api-access-5l4fr" (OuterVolumeSpecName: "kube-api-access-5l4fr") pod "66a06231-6cc9-4a1a-a9bb-c1a236fd0485" (UID: "66a06231-6cc9-4a1a-a9bb-c1a236fd0485"). InnerVolumeSpecName "kube-api-access-5l4fr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:23:09.481453 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.481402 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:23:09.481453 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.481446 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5l4fr\" (UniqueName: \"kubernetes.io/projected/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-kube-api-access-5l4fr\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:23:09.481453 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.481458 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:23:09.481704 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.481466 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/66a06231-6cc9-4a1a-a9bb-c1a236fd0485-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:23:09.817036 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.816998 2570 generic.go:358] "Generic (PLEG): container finished" podID="ca63ae87-4568-4b94-8f99-cf09ee0f52b5" containerID="2a376967c2c154d61bb21a667a2a49e97438f22ed74ef60082e8e2824b168ae0" exitCode=0 Apr 24 22:23:09.817227 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.817074 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" event={"ID":"ca63ae87-4568-4b94-8f99-cf09ee0f52b5","Type":"ContainerDied","Data":"2a376967c2c154d61bb21a667a2a49e97438f22ed74ef60082e8e2824b168ae0"} Apr 24 22:23:09.818870 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.818848 2570 generic.go:358] 
"Generic (PLEG): container finished" podID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" containerID="7a4a3b14a4627c330559fcd2d18b00a4dbac2f027cf5cfca1c37d82e7e148d7d" exitCode=0 Apr 24 22:23:09.818959 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.818885 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" event={"ID":"66a06231-6cc9-4a1a-a9bb-c1a236fd0485","Type":"ContainerDied","Data":"7a4a3b14a4627c330559fcd2d18b00a4dbac2f027cf5cfca1c37d82e7e148d7d"} Apr 24 22:23:09.818959 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.818911 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" event={"ID":"66a06231-6cc9-4a1a-a9bb-c1a236fd0485","Type":"ContainerDied","Data":"7ea46bb79c62fd781ccb41ed5f7dcda6e41b0778a2cee28c446e58b01a77683a"} Apr 24 22:23:09.818959 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.818928 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf" Apr 24 22:23:09.819089 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.818929 2570 scope.go:117] "RemoveContainer" containerID="060bbcf9a1db89e615a3bc6a30ceee677e4d9f28939c0caaf342337c9d0e4708" Apr 24 22:23:09.830976 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.830957 2570 scope.go:117] "RemoveContainer" containerID="7a4a3b14a4627c330559fcd2d18b00a4dbac2f027cf5cfca1c37d82e7e148d7d" Apr 24 22:23:09.838754 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.838734 2570 scope.go:117] "RemoveContainer" containerID="94d5a3624a3a8785a6b2de803fc7835bd964942807104950164d0d584535a28a" Apr 24 22:23:09.847922 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.847898 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf"] Apr 24 22:23:09.854115 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.854090 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-h7rxf"] Apr 24 22:23:09.859291 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.859275 2570 scope.go:117] "RemoveContainer" containerID="060bbcf9a1db89e615a3bc6a30ceee677e4d9f28939c0caaf342337c9d0e4708" Apr 24 22:23:09.859589 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:23:09.859567 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"060bbcf9a1db89e615a3bc6a30ceee677e4d9f28939c0caaf342337c9d0e4708\": container with ID starting with 060bbcf9a1db89e615a3bc6a30ceee677e4d9f28939c0caaf342337c9d0e4708 not found: ID does not exist" containerID="060bbcf9a1db89e615a3bc6a30ceee677e4d9f28939c0caaf342337c9d0e4708" Apr 24 22:23:09.859645 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.859599 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"060bbcf9a1db89e615a3bc6a30ceee677e4d9f28939c0caaf342337c9d0e4708"} err="failed to get container status \"060bbcf9a1db89e615a3bc6a30ceee677e4d9f28939c0caaf342337c9d0e4708\": rpc error: code = NotFound desc = could not find container \"060bbcf9a1db89e615a3bc6a30ceee677e4d9f28939c0caaf342337c9d0e4708\": container with ID starting with 060bbcf9a1db89e615a3bc6a30ceee677e4d9f28939c0caaf342337c9d0e4708 not found: ID does not exist" Apr 24 22:23:09.859645 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.859618 2570 scope.go:117] "RemoveContainer" 
containerID="7a4a3b14a4627c330559fcd2d18b00a4dbac2f027cf5cfca1c37d82e7e148d7d" Apr 24 22:23:09.859926 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:23:09.859905 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a4a3b14a4627c330559fcd2d18b00a4dbac2f027cf5cfca1c37d82e7e148d7d\": container with ID starting with 7a4a3b14a4627c330559fcd2d18b00a4dbac2f027cf5cfca1c37d82e7e148d7d not found: ID does not exist" containerID="7a4a3b14a4627c330559fcd2d18b00a4dbac2f027cf5cfca1c37d82e7e148d7d" Apr 24 22:23:09.860048 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.859936 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a4a3b14a4627c330559fcd2d18b00a4dbac2f027cf5cfca1c37d82e7e148d7d"} err="failed to get container status \"7a4a3b14a4627c330559fcd2d18b00a4dbac2f027cf5cfca1c37d82e7e148d7d\": rpc error: code = NotFound desc = could not find container \"7a4a3b14a4627c330559fcd2d18b00a4dbac2f027cf5cfca1c37d82e7e148d7d\": container with ID starting with 7a4a3b14a4627c330559fcd2d18b00a4dbac2f027cf5cfca1c37d82e7e148d7d not found: ID does not exist" Apr 24 22:23:09.860048 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.859959 2570 scope.go:117] "RemoveContainer" containerID="94d5a3624a3a8785a6b2de803fc7835bd964942807104950164d0d584535a28a" Apr 24 22:23:09.860265 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:23:09.860247 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94d5a3624a3a8785a6b2de803fc7835bd964942807104950164d0d584535a28a\": container with ID starting with 94d5a3624a3a8785a6b2de803fc7835bd964942807104950164d0d584535a28a not found: ID does not exist" containerID="94d5a3624a3a8785a6b2de803fc7835bd964942807104950164d0d584535a28a" Apr 24 22:23:09.860323 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:09.860275 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94d5a3624a3a8785a6b2de803fc7835bd964942807104950164d0d584535a28a"} err="failed to get container status \"94d5a3624a3a8785a6b2de803fc7835bd964942807104950164d0d584535a28a\": rpc error: code = NotFound desc = could not find container \"94d5a3624a3a8785a6b2de803fc7835bd964942807104950164d0d584535a28a\": container with ID starting with 94d5a3624a3a8785a6b2de803fc7835bd964942807104950164d0d584535a28a not found: ID does not exist" Apr 24 22:23:10.068251 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:10.068167 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" path="/var/lib/kubelet/pods/66a06231-6cc9-4a1a-a9bb-c1a236fd0485/volumes" Apr 24 22:23:10.824181 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:10.824144 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" event={"ID":"ca63ae87-4568-4b94-8f99-cf09ee0f52b5","Type":"ContainerStarted","Data":"57df47df88670c14d0f80afabc0a6a179ed5784d4b385837b27814fab9f2c01c"} Apr 24 22:23:10.824181 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:10.824179 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" event={"ID":"ca63ae87-4568-4b94-8f99-cf09ee0f52b5","Type":"ContainerStarted","Data":"e537b9d230b2e53cd54b062b4ec11acffabfbcf4087c501e9ea5d8473159ff58"} Apr 24 22:23:10.824633 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:10.824408 2570 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" Apr 24 22:23:10.824633 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:10.824475 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" Apr 24 22:23:10.844448 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:10.844396 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" podStartSLOduration=5.844382744 podStartE2EDuration="5.844382744s" podCreationTimestamp="2026-04-24 22:23:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:23:10.842934532 +0000 UTC m=+3343.266441939" watchObservedRunningTime="2026-04-24 22:23:10.844382744 +0000 UTC m=+3343.267890150" Apr 24 22:23:16.832498 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:16.832467 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" Apr 24 22:23:21.145382 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:21.145351 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 22:23:21.147521 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:21.147499 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 22:23:46.918461 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:46.918409 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" podUID="ca63ae87-4568-4b94-8f99-cf09ee0f52b5" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 24 22:23:56.835278 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:23:56.835248 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" Apr 24 22:24:05.571543 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.571510 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t"] Apr 24 22:24:05.572087 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.571923 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" podUID="ca63ae87-4568-4b94-8f99-cf09ee0f52b5" containerName="kserve-container" containerID="cri-o://e537b9d230b2e53cd54b062b4ec11acffabfbcf4087c501e9ea5d8473159ff58" gracePeriod=30 Apr 24 22:24:05.572087 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.571965 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" podUID="ca63ae87-4568-4b94-8f99-cf09ee0f52b5" containerName="kube-rbac-proxy" containerID="cri-o://57df47df88670c14d0f80afabc0a6a179ed5784d4b385837b27814fab9f2c01c" gracePeriod=30 Apr 24 22:24:05.648105 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.648063 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49"] Apr 24 22:24:05.648463 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.648447 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" containerName="storage-initializer" Apr 24 22:24:05.648550 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.648469 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" containerName="storage-initializer" Apr 24 22:24:05.648550 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.648490 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" containerName="kserve-container" Apr 24 22:24:05.648550 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.648499 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" containerName="kserve-container" Apr 24 22:24:05.648550 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.648512 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" containerName="kube-rbac-proxy" Apr 24 22:24:05.648550 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.648520 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" containerName="kube-rbac-proxy" Apr 24 22:24:05.648858 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.648606 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" containerName="kserve-container" Apr 24 22:24:05.648858 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.648618 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="66a06231-6cc9-4a1a-a9bb-c1a236fd0485" containerName="kube-rbac-proxy" Apr 24 22:24:05.651891 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.651863 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" Apr 24 22:24:05.654092 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.654071 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-predictor-serving-cert\"" Apr 24 22:24:05.654201 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.654180 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 24 22:24:05.661710 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.661692 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49"] Apr 24 22:24:05.799782 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.799748 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b156735-4d33-43ad-9a66-faf4c1813a90-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-jwx49\" (UID: \"0b156735-4d33-43ad-9a66-faf4c1813a90\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" Apr 24 22:24:05.799978 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.799796 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b156735-4d33-43ad-9a66-faf4c1813a90-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-jwx49\" (UID: \"0b156735-4d33-43ad-9a66-faf4c1813a90\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" Apr 24 22:24:05.799978 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.799912 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b156735-4d33-43ad-9a66-faf4c1813a90-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-jwx49\" (UID: \"0b156735-4d33-43ad-9a66-faf4c1813a90\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" Apr 24 22:24:05.799978 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.799972 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7vgt\" (UniqueName: \"kubernetes.io/projected/0b156735-4d33-43ad-9a66-faf4c1813a90-kube-api-access-z7vgt\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-jwx49\" (UID: \"0b156735-4d33-43ad-9a66-faf4c1813a90\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" Apr 24 22:24:05.901346 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.901258 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b156735-4d33-43ad-9a66-faf4c1813a90-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-jwx49\" (UID: \"0b156735-4d33-43ad-9a66-faf4c1813a90\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" Apr 24 22:24:05.901346 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.901309 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b156735-4d33-43ad-9a66-faf4c1813a90-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-jwx49\" (UID: 
\"0b156735-4d33-43ad-9a66-faf4c1813a90\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" Apr 24 22:24:05.901547 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.901356 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b156735-4d33-43ad-9a66-faf4c1813a90-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-jwx49\" (UID: \"0b156735-4d33-43ad-9a66-faf4c1813a90\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" Apr 24 22:24:05.901547 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.901395 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7vgt\" (UniqueName: \"kubernetes.io/projected/0b156735-4d33-43ad-9a66-faf4c1813a90-kube-api-access-z7vgt\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-jwx49\" (UID: \"0b156735-4d33-43ad-9a66-faf4c1813a90\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" Apr 24 22:24:05.901794 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.901774 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b156735-4d33-43ad-9a66-faf4c1813a90-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-jwx49\" (UID: \"0b156735-4d33-43ad-9a66-faf4c1813a90\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" Apr 24 22:24:05.902017 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.901997 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b156735-4d33-43ad-9a66-faf4c1813a90-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-jwx49\" (UID: \"0b156735-4d33-43ad-9a66-faf4c1813a90\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" Apr 24 22:24:05.904051 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.904012 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b156735-4d33-43ad-9a66-faf4c1813a90-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-jwx49\" (UID: \"0b156735-4d33-43ad-9a66-faf4c1813a90\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" Apr 24 22:24:05.909411 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.909384 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7vgt\" (UniqueName: \"kubernetes.io/projected/0b156735-4d33-43ad-9a66-faf4c1813a90-kube-api-access-z7vgt\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-jwx49\" (UID: \"0b156735-4d33-43ad-9a66-faf4c1813a90\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" Apr 24 22:24:05.962450 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.962407 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" Apr 24 22:24:05.979066 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.979012 2570 generic.go:358] "Generic (PLEG): container finished" podID="ca63ae87-4568-4b94-8f99-cf09ee0f52b5" containerID="57df47df88670c14d0f80afabc0a6a179ed5784d4b385837b27814fab9f2c01c" exitCode=2 Apr 24 22:24:05.979066 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:05.979056 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" event={"ID":"ca63ae87-4568-4b94-8f99-cf09ee0f52b5","Type":"ContainerDied","Data":"57df47df88670c14d0f80afabc0a6a179ed5784d4b385837b27814fab9f2c01c"} Apr 24 22:24:06.088785 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:06.088615 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49"] Apr 24 22:24:06.091359 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:24:06.091329 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b156735_4d33_43ad_9a66_faf4c1813a90.slice/crio-5432d00d417f2446bec2108bc1dd3ecfce5e26f7a5ef14744b6a64073e4e3a18 WatchSource:0}: Error finding container 5432d00d417f2446bec2108bc1dd3ecfce5e26f7a5ef14744b6a64073e4e3a18: Status 404 returned error can't find the container with id 5432d00d417f2446bec2108bc1dd3ecfce5e26f7a5ef14744b6a64073e4e3a18 Apr 24 22:24:06.828700 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:06.828658 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" podUID="ca63ae87-4568-4b94-8f99-cf09ee0f52b5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.59:8643/healthz\": dial tcp 10.133.0.59:8643: connect: connection refused" Apr 24 22:24:06.983623 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:06.983584 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" event={"ID":"0b156735-4d33-43ad-9a66-faf4c1813a90","Type":"ContainerStarted","Data":"5ded870e9f09fba2de2873f11a9e0337cf1d6f6e0a1fc4050f0e8f07d1e5999a"} Apr 24 22:24:06.983623 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:06.983626 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" event={"ID":"0b156735-4d33-43ad-9a66-faf4c1813a90","Type":"ContainerStarted","Data":"5432d00d417f2446bec2108bc1dd3ecfce5e26f7a5ef14744b6a64073e4e3a18"} Apr 24 22:24:07.873223 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:07.873172 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" podUID="ca63ae87-4568-4b94-8f99-cf09ee0f52b5" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.59:8080/v2/models/isvc-xgboost-v2-runtime/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 24 22:24:09.993156 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:09.993121 2570 generic.go:358] "Generic (PLEG): container finished" podID="0b156735-4d33-43ad-9a66-faf4c1813a90" containerID="5ded870e9f09fba2de2873f11a9e0337cf1d6f6e0a1fc4050f0e8f07d1e5999a" exitCode=0 Apr 24 22:24:09.993584 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:09.993202 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" event={"ID":"0b156735-4d33-43ad-9a66-faf4c1813a90","Type":"ContainerDied","Data":"5ded870e9f09fba2de2873f11a9e0337cf1d6f6e0a1fc4050f0e8f07d1e5999a"} Apr 24 22:24:10.998789 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:10.998707 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" event={"ID":"0b156735-4d33-43ad-9a66-faf4c1813a90","Type":"ContainerStarted","Data":"fe1a94878799e00cf0275f454c228b8094404d9175445150903869d2f29f7aee"} Apr 24 22:24:10.998789 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:10.998751 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" event={"ID":"0b156735-4d33-43ad-9a66-faf4c1813a90","Type":"ContainerStarted","Data":"6b79b3e2bc0eaf11c9ea6819364ebeedbfc898637030a758fc06485caf27977e"} Apr 24 22:24:10.999255 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:10.998960 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" Apr 24 22:24:11.020627 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:11.020555 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" podStartSLOduration=6.020533523 podStartE2EDuration="6.020533523s" podCreationTimestamp="2026-04-24 22:24:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:24:11.018777077 +0000 UTC m=+3403.442284497" watchObservedRunningTime="2026-04-24 22:24:11.020533523 +0000 UTC m=+3403.444040937" Apr 24 22:24:11.827977 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:11.827938 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" podUID="ca63ae87-4568-4b94-8f99-cf09ee0f52b5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.59:8643/healthz\": dial tcp 10.133.0.59:8643: connect: connection refused" Apr 24 22:24:12.002098 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:12.002066 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" Apr 24 22:24:12.003459 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:12.003430 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" podUID="0b156735-4d33-43ad-9a66-faf4c1813a90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 24 22:24:13.004613 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:13.004570 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" podUID="0b156735-4d33-43ad-9a66-faf4c1813a90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 24 22:24:13.320360 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:13.320334 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" Apr 24 22:24:13.357641 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:13.357613 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtnhp\" (UniqueName: \"kubernetes.io/projected/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-kube-api-access-gtnhp\") pod \"ca63ae87-4568-4b94-8f99-cf09ee0f52b5\" (UID: \"ca63ae87-4568-4b94-8f99-cf09ee0f52b5\") " Apr 24 22:24:13.357808 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:13.357651 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-kserve-provision-location\") pod \"ca63ae87-4568-4b94-8f99-cf09ee0f52b5\" (UID: \"ca63ae87-4568-4b94-8f99-cf09ee0f52b5\") " Apr 24 22:24:13.357808 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:13.357729 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-proxy-tls\") pod \"ca63ae87-4568-4b94-8f99-cf09ee0f52b5\" (UID: \"ca63ae87-4568-4b94-8f99-cf09ee0f52b5\") " Apr 24 22:24:13.357808 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:13.357785 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"ca63ae87-4568-4b94-8f99-cf09ee0f52b5\" (UID: \"ca63ae87-4568-4b94-8f99-cf09ee0f52b5\") " Apr 24 22:24:13.358145 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:13.358115 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ca63ae87-4568-4b94-8f99-cf09ee0f52b5" (UID: "ca63ae87-4568-4b94-8f99-cf09ee0f52b5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:24:13.358266 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:13.358183 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config") pod "ca63ae87-4568-4b94-8f99-cf09ee0f52b5" (UID: "ca63ae87-4568-4b94-8f99-cf09ee0f52b5"). InnerVolumeSpecName "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:24:13.360447 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:13.360425 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ca63ae87-4568-4b94-8f99-cf09ee0f52b5" (UID: "ca63ae87-4568-4b94-8f99-cf09ee0f52b5"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:24:13.360529 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:13.360442 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-kube-api-access-gtnhp" (OuterVolumeSpecName: "kube-api-access-gtnhp") pod "ca63ae87-4568-4b94-8f99-cf09ee0f52b5" (UID: "ca63ae87-4568-4b94-8f99-cf09ee0f52b5"). InnerVolumeSpecName "kube-api-access-gtnhp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:24:13.459282 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:13.459247 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:24:13.459282 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:13.459277 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:24:13.459282 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:13.459288 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:24:13.459512 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:13.459299 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gtnhp\" (UniqueName: \"kubernetes.io/projected/ca63ae87-4568-4b94-8f99-cf09ee0f52b5-kube-api-access-gtnhp\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:24:14.008404 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:14.008368 2570 generic.go:358] "Generic (PLEG): container finished" podID="ca63ae87-4568-4b94-8f99-cf09ee0f52b5" containerID="e537b9d230b2e53cd54b062b4ec11acffabfbcf4087c501e9ea5d8473159ff58" exitCode=0 Apr 24 22:24:14.008827 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:14.008439 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" event={"ID":"ca63ae87-4568-4b94-8f99-cf09ee0f52b5","Type":"ContainerDied","Data":"e537b9d230b2e53cd54b062b4ec11acffabfbcf4087c501e9ea5d8473159ff58"} Apr 24 22:24:14.008827 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:14.008468 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" event={"ID":"ca63ae87-4568-4b94-8f99-cf09ee0f52b5","Type":"ContainerDied","Data":"b13aa3a0ab61ca6ed094845dd1432c7ff89f87b167d0c097cac50ee13414fa81"} Apr 24 22:24:14.008827 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:14.008467 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t" Apr 24 22:24:14.008827 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:14.008480 2570 scope.go:117] "RemoveContainer" containerID="57df47df88670c14d0f80afabc0a6a179ed5784d4b385837b27814fab9f2c01c" Apr 24 22:24:14.016588 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:14.016533 2570 scope.go:117] "RemoveContainer" containerID="e537b9d230b2e53cd54b062b4ec11acffabfbcf4087c501e9ea5d8473159ff58" Apr 24 22:24:14.024521 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:14.024502 2570 scope.go:117] "RemoveContainer" containerID="2a376967c2c154d61bb21a667a2a49e97438f22ed74ef60082e8e2824b168ae0" Apr 24 22:24:14.030414 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:14.030390 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t"] Apr 24 22:24:14.032394 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:14.032378 2570 scope.go:117] "RemoveContainer" containerID="57df47df88670c14d0f80afabc0a6a179ed5784d4b385837b27814fab9f2c01c" Apr 24 22:24:14.032677 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:24:14.032658 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57df47df88670c14d0f80afabc0a6a179ed5784d4b385837b27814fab9f2c01c\": container with ID starting with 57df47df88670c14d0f80afabc0a6a179ed5784d4b385837b27814fab9f2c01c not found: ID does not exist" containerID="57df47df88670c14d0f80afabc0a6a179ed5784d4b385837b27814fab9f2c01c" Apr 24 22:24:14.032732 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:14.032687 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57df47df88670c14d0f80afabc0a6a179ed5784d4b385837b27814fab9f2c01c"} err="failed to get container status \"57df47df88670c14d0f80afabc0a6a179ed5784d4b385837b27814fab9f2c01c\": rpc error: code = NotFound desc = could not find container \"57df47df88670c14d0f80afabc0a6a179ed5784d4b385837b27814fab9f2c01c\": container with ID starting with 57df47df88670c14d0f80afabc0a6a179ed5784d4b385837b27814fab9f2c01c not found: ID does not exist" Apr 24 22:24:14.032732 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:14.032706 2570 scope.go:117] "RemoveContainer" containerID="e537b9d230b2e53cd54b062b4ec11acffabfbcf4087c501e9ea5d8473159ff58" Apr 24 22:24:14.032947 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:24:14.032927 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e537b9d230b2e53cd54b062b4ec11acffabfbcf4087c501e9ea5d8473159ff58\": container with ID starting with e537b9d230b2e53cd54b062b4ec11acffabfbcf4087c501e9ea5d8473159ff58 not found: ID does not exist" containerID="e537b9d230b2e53cd54b062b4ec11acffabfbcf4087c501e9ea5d8473159ff58" Apr 24 22:24:14.033013 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:14.032958 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e537b9d230b2e53cd54b062b4ec11acffabfbcf4087c501e9ea5d8473159ff58"} err="failed to get container status \"e537b9d230b2e53cd54b062b4ec11acffabfbcf4087c501e9ea5d8473159ff58\": rpc error: code = NotFound desc = could not find container \"e537b9d230b2e53cd54b062b4ec11acffabfbcf4087c501e9ea5d8473159ff58\": container with ID starting with e537b9d230b2e53cd54b062b4ec11acffabfbcf4087c501e9ea5d8473159ff58 not found: ID does not exist" Apr 24 22:24:14.033013 ip-10-0-129-230 kubenswrapper[2570]: 
I0424 22:24:14.032981 2570 scope.go:117] "RemoveContainer" containerID="2a376967c2c154d61bb21a667a2a49e97438f22ed74ef60082e8e2824b168ae0" Apr 24 22:24:14.033317 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:24:14.033298 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a376967c2c154d61bb21a667a2a49e97438f22ed74ef60082e8e2824b168ae0\": container with ID starting with 2a376967c2c154d61bb21a667a2a49e97438f22ed74ef60082e8e2824b168ae0 not found: ID does not exist" containerID="2a376967c2c154d61bb21a667a2a49e97438f22ed74ef60082e8e2824b168ae0" Apr 24 22:24:14.033370 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:14.033323 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a376967c2c154d61bb21a667a2a49e97438f22ed74ef60082e8e2824b168ae0"} err="failed to get container status \"2a376967c2c154d61bb21a667a2a49e97438f22ed74ef60082e8e2824b168ae0\": rpc error: code = NotFound desc = could not find container \"2a376967c2c154d61bb21a667a2a49e97438f22ed74ef60082e8e2824b168ae0\": container with ID starting with 2a376967c2c154d61bb21a667a2a49e97438f22ed74ef60082e8e2824b168ae0 not found: ID does not exist" Apr 24 22:24:14.036339 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:14.036317 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-8lw8t"] Apr 24 22:24:14.067458 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:14.067433 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca63ae87-4568-4b94-8f99-cf09ee0f52b5" path="/var/lib/kubelet/pods/ca63ae87-4568-4b94-8f99-cf09ee0f52b5/volumes" Apr 24 22:24:18.009099 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:18.009068 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" Apr 24 22:24:18.009653 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:18.009629 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" podUID="0b156735-4d33-43ad-9a66-faf4c1813a90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 24 22:24:28.009634 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:28.009594 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" podUID="0b156735-4d33-43ad-9a66-faf4c1813a90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 24 22:24:38.010274 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:38.010235 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" podUID="0b156735-4d33-43ad-9a66-faf4c1813a90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 24 22:24:48.009617 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:48.009579 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" podUID="0b156735-4d33-43ad-9a66-faf4c1813a90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 24 22:24:58.009918 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:24:58.009875 2570 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" podUID="0b156735-4d33-43ad-9a66-faf4c1813a90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 24 22:25:08.010521 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:08.010477 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" podUID="0b156735-4d33-43ad-9a66-faf4c1813a90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 24 22:25:18.010601 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:18.010573 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" Apr 24 22:25:25.775624 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:25.775591 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49"] Apr 24 22:25:25.776068 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:25.775943 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" podUID="0b156735-4d33-43ad-9a66-faf4c1813a90" containerName="kserve-container" containerID="cri-o://6b79b3e2bc0eaf11c9ea6819364ebeedbfc898637030a758fc06485caf27977e" gracePeriod=30 Apr 24 22:25:25.776068 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:25.776010 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" podUID="0b156735-4d33-43ad-9a66-faf4c1813a90" containerName="kube-rbac-proxy" containerID="cri-o://fe1a94878799e00cf0275f454c228b8094404d9175445150903869d2f29f7aee" gracePeriod=30 Apr 24 22:25:25.859511 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:25.859470 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8"] Apr 24 22:25:25.859762 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:25.859749 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca63ae87-4568-4b94-8f99-cf09ee0f52b5" containerName="kserve-container" Apr 24 22:25:25.859817 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:25.859763 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca63ae87-4568-4b94-8f99-cf09ee0f52b5" containerName="kserve-container" Apr 24 22:25:25.859817 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:25.859775 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca63ae87-4568-4b94-8f99-cf09ee0f52b5" containerName="storage-initializer" Apr 24 22:25:25.859817 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:25.859781 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca63ae87-4568-4b94-8f99-cf09ee0f52b5" containerName="storage-initializer" Apr 24 22:25:25.859817 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:25.859796 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca63ae87-4568-4b94-8f99-cf09ee0f52b5" containerName="kube-rbac-proxy" Apr 24 22:25:25.859817 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:25.859802 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca63ae87-4568-4b94-8f99-cf09ee0f52b5" containerName="kube-rbac-proxy" Apr 24 22:25:25.859982 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:25.859850 2570 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="ca63ae87-4568-4b94-8f99-cf09ee0f52b5" containerName="kserve-container" Apr 24 22:25:25.859982 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:25.859857 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca63ae87-4568-4b94-8f99-cf09ee0f52b5" containerName="kube-rbac-proxy" Apr 24 22:25:25.862855 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:25.862839 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" Apr 24 22:25:25.865654 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:25.865613 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-kube-rbac-proxy-sar-config\"" Apr 24 22:25:25.865654 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:25.865631 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-predictor-serving-cert\"" Apr 24 22:25:25.865831 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:25.865695 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 24 22:25:25.877780 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:25.877756 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8"] Apr 24 22:25:25.917145 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:25.917116 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2502446c-803a-4176-a7fd-565b6bdf0371-proxy-tls\") pod \"isvc-sklearn-s3-predictor-5455b66b9c-g9rw8\" (UID: \"2502446c-803a-4176-a7fd-565b6bdf0371\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" Apr 24 22:25:25.917310 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:25.917179 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5zb5\" (UniqueName: \"kubernetes.io/projected/2502446c-803a-4176-a7fd-565b6bdf0371-kube-api-access-n5zb5\") pod \"isvc-sklearn-s3-predictor-5455b66b9c-g9rw8\" (UID: \"2502446c-803a-4176-a7fd-565b6bdf0371\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" Apr 24 22:25:25.917310 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:25.917222 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2502446c-803a-4176-a7fd-565b6bdf0371-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-5455b66b9c-g9rw8\" (UID: \"2502446c-803a-4176-a7fd-565b6bdf0371\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" Apr 24 22:25:25.917310 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:25.917248 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2502446c-803a-4176-a7fd-565b6bdf0371-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-5455b66b9c-g9rw8\" (UID: \"2502446c-803a-4176-a7fd-565b6bdf0371\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" Apr 24 22:25:26.017972 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:26.017939 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2502446c-803a-4176-a7fd-565b6bdf0371-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-5455b66b9c-g9rw8\" (UID: \"2502446c-803a-4176-a7fd-565b6bdf0371\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" Apr 24 22:25:26.018260 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:26.017997 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2502446c-803a-4176-a7fd-565b6bdf0371-proxy-tls\") pod \"isvc-sklearn-s3-predictor-5455b66b9c-g9rw8\" (UID: \"2502446c-803a-4176-a7fd-565b6bdf0371\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" Apr 24 22:25:26.018260 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:26.018060 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5zb5\" (UniqueName: \"kubernetes.io/projected/2502446c-803a-4176-a7fd-565b6bdf0371-kube-api-access-n5zb5\") pod \"isvc-sklearn-s3-predictor-5455b66b9c-g9rw8\" (UID: \"2502446c-803a-4176-a7fd-565b6bdf0371\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" Apr 24 22:25:26.018260 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:26.018079 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2502446c-803a-4176-a7fd-565b6bdf0371-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-5455b66b9c-g9rw8\" (UID: \"2502446c-803a-4176-a7fd-565b6bdf0371\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" Apr 24 22:25:26.018465 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:26.018447 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2502446c-803a-4176-a7fd-565b6bdf0371-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-5455b66b9c-g9rw8\" (UID: \"2502446c-803a-4176-a7fd-565b6bdf0371\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" Apr 24 22:25:26.018652 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:26.018630 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2502446c-803a-4176-a7fd-565b6bdf0371-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-5455b66b9c-g9rw8\" (UID: \"2502446c-803a-4176-a7fd-565b6bdf0371\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" Apr 24 22:25:26.020680 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:26.020657 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2502446c-803a-4176-a7fd-565b6bdf0371-proxy-tls\") pod \"isvc-sklearn-s3-predictor-5455b66b9c-g9rw8\" (UID: \"2502446c-803a-4176-a7fd-565b6bdf0371\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" Apr 24 22:25:26.027179 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:26.027113 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5zb5\" (UniqueName: \"kubernetes.io/projected/2502446c-803a-4176-a7fd-565b6bdf0371-kube-api-access-n5zb5\") pod \"isvc-sklearn-s3-predictor-5455b66b9c-g9rw8\" (UID: \"2502446c-803a-4176-a7fd-565b6bdf0371\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" Apr 24 22:25:26.173784 
ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:26.173751 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" Apr 24 22:25:26.205082 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:26.205053 2570 generic.go:358] "Generic (PLEG): container finished" podID="0b156735-4d33-43ad-9a66-faf4c1813a90" containerID="fe1a94878799e00cf0275f454c228b8094404d9175445150903869d2f29f7aee" exitCode=2 Apr 24 22:25:26.205249 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:26.205112 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" event={"ID":"0b156735-4d33-43ad-9a66-faf4c1813a90","Type":"ContainerDied","Data":"fe1a94878799e00cf0275f454c228b8094404d9175445150903869d2f29f7aee"} Apr 24 22:25:26.303300 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:26.303277 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8"] Apr 24 22:25:26.305912 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:25:26.305882 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2502446c_803a_4176_a7fd_565b6bdf0371.slice/crio-7e38e982d1b29cf94546b63365b80f2c5d6bbbbe3a2fe06b707c6bba34c3ca56 WatchSource:0}: Error finding container 7e38e982d1b29cf94546b63365b80f2c5d6bbbbe3a2fe06b707c6bba34c3ca56: Status 404 returned error can't find the container with id 7e38e982d1b29cf94546b63365b80f2c5d6bbbbe3a2fe06b707c6bba34c3ca56 Apr 24 22:25:26.307739 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:26.307723 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:25:27.210825 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:27.210779 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" event={"ID":"2502446c-803a-4176-a7fd-565b6bdf0371","Type":"ContainerStarted","Data":"ecb916b5f121106cc0570c3ec08d905a85b743d9aacd8e21dfb183caffad762c"} Apr 24 22:25:27.210825 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:27.210820 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" event={"ID":"2502446c-803a-4176-a7fd-565b6bdf0371","Type":"ContainerStarted","Data":"7e38e982d1b29cf94546b63365b80f2c5d6bbbbe3a2fe06b707c6bba34c3ca56"} Apr 24 22:25:28.005577 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:28.005530 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" podUID="0b156735-4d33-43ad-9a66-faf4c1813a90" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.60:8643/healthz\": dial tcp 10.133.0.60:8643: connect: connection refused" Apr 24 22:25:28.009888 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:28.009864 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" podUID="0b156735-4d33-43ad-9a66-faf4c1813a90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 24 22:25:28.215974 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:28.215934 2570 generic.go:358] "Generic (PLEG): container finished" podID="2502446c-803a-4176-a7fd-565b6bdf0371" containerID="ecb916b5f121106cc0570c3ec08d905a85b743d9aacd8e21dfb183caffad762c" exitCode=0 
Apr 24 22:25:28.216469 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:28.216013 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" event={"ID":"2502446c-803a-4176-a7fd-565b6bdf0371","Type":"ContainerDied","Data":"ecb916b5f121106cc0570c3ec08d905a85b743d9aacd8e21dfb183caffad762c"} Apr 24 22:25:29.222488 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:29.222455 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" event={"ID":"2502446c-803a-4176-a7fd-565b6bdf0371","Type":"ContainerStarted","Data":"f25c92ad899cede7ef08e2f0828349a57e301913aba00705c60765512c487118"} Apr 24 22:25:29.222488 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:29.222492 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" event={"ID":"2502446c-803a-4176-a7fd-565b6bdf0371","Type":"ContainerStarted","Data":"d1237945949860850242f053dbd2cc42cc8e9e28b1112901a5313cc4b239e071"} Apr 24 22:25:29.222970 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:29.222668 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" Apr 24 22:25:29.222970 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:29.222688 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" Apr 24 22:25:29.224187 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:29.224153 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" podUID="2502446c-803a-4176-a7fd-565b6bdf0371" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 24 22:25:29.245369 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:29.245319 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" podStartSLOduration=4.245306421 podStartE2EDuration="4.245306421s" podCreationTimestamp="2026-04-24 22:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:25:29.243898466 +0000 UTC m=+3481.667405869" watchObservedRunningTime="2026-04-24 22:25:29.245306421 +0000 UTC m=+3481.668813827" Apr 24 22:25:29.621500 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:29.621479 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" Apr 24 22:25:29.644983 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:29.644956 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b156735-4d33-43ad-9a66-faf4c1813a90-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"0b156735-4d33-43ad-9a66-faf4c1813a90\" (UID: \"0b156735-4d33-43ad-9a66-faf4c1813a90\") " Apr 24 22:25:29.645133 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:29.644995 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7vgt\" (UniqueName: \"kubernetes.io/projected/0b156735-4d33-43ad-9a66-faf4c1813a90-kube-api-access-z7vgt\") pod \"0b156735-4d33-43ad-9a66-faf4c1813a90\" (UID: \"0b156735-4d33-43ad-9a66-faf4c1813a90\") " Apr 24 22:25:29.645133 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:29.645086 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b156735-4d33-43ad-9a66-faf4c1813a90-kserve-provision-location\") pod \"0b156735-4d33-43ad-9a66-faf4c1813a90\" (UID: \"0b156735-4d33-43ad-9a66-faf4c1813a90\") " Apr 24 22:25:29.645260 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:29.645131 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b156735-4d33-43ad-9a66-faf4c1813a90-proxy-tls\") pod \"0b156735-4d33-43ad-9a66-faf4c1813a90\" (UID: \"0b156735-4d33-43ad-9a66-faf4c1813a90\") " Apr 24 22:25:29.645388 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:29.645366 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b156735-4d33-43ad-9a66-faf4c1813a90-isvc-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-kube-rbac-proxy-sar-config") pod "0b156735-4d33-43ad-9a66-faf4c1813a90" (UID: "0b156735-4d33-43ad-9a66-faf4c1813a90"). InnerVolumeSpecName "isvc-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:25:29.645516 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:29.645437 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b156735-4d33-43ad-9a66-faf4c1813a90-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0b156735-4d33-43ad-9a66-faf4c1813a90" (UID: "0b156735-4d33-43ad-9a66-faf4c1813a90"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:25:29.647364 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:29.647336 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b156735-4d33-43ad-9a66-faf4c1813a90-kube-api-access-z7vgt" (OuterVolumeSpecName: "kube-api-access-z7vgt") pod "0b156735-4d33-43ad-9a66-faf4c1813a90" (UID: "0b156735-4d33-43ad-9a66-faf4c1813a90"). InnerVolumeSpecName "kube-api-access-z7vgt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:25:29.647482 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:29.647453 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b156735-4d33-43ad-9a66-faf4c1813a90-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b156735-4d33-43ad-9a66-faf4c1813a90" (UID: "0b156735-4d33-43ad-9a66-faf4c1813a90"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:25:29.746316 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:29.746228 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b156735-4d33-43ad-9a66-faf4c1813a90-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:25:29.746316 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:29.746265 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b156735-4d33-43ad-9a66-faf4c1813a90-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:25:29.746316 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:29.746275 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b156735-4d33-43ad-9a66-faf4c1813a90-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:25:29.746316 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:29.746284 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z7vgt\" (UniqueName: \"kubernetes.io/projected/0b156735-4d33-43ad-9a66-faf4c1813a90-kube-api-access-z7vgt\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:25:30.227345 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:30.227309 2570 generic.go:358] "Generic (PLEG): container finished" podID="0b156735-4d33-43ad-9a66-faf4c1813a90" containerID="6b79b3e2bc0eaf11c9ea6819364ebeedbfc898637030a758fc06485caf27977e" exitCode=0 Apr 24 22:25:30.227828 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:30.227387 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" Apr 24 22:25:30.227828 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:30.227432 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" event={"ID":"0b156735-4d33-43ad-9a66-faf4c1813a90","Type":"ContainerDied","Data":"6b79b3e2bc0eaf11c9ea6819364ebeedbfc898637030a758fc06485caf27977e"} Apr 24 22:25:30.227828 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:30.227466 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49" event={"ID":"0b156735-4d33-43ad-9a66-faf4c1813a90","Type":"ContainerDied","Data":"5432d00d417f2446bec2108bc1dd3ecfce5e26f7a5ef14744b6a64073e4e3a18"} Apr 24 22:25:30.227828 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:30.227486 2570 scope.go:117] "RemoveContainer" containerID="fe1a94878799e00cf0275f454c228b8094404d9175445150903869d2f29f7aee" Apr 24 22:25:30.228095 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:30.227885 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" podUID="2502446c-803a-4176-a7fd-565b6bdf0371" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 24 22:25:30.240474 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:30.240449 2570 scope.go:117] "RemoveContainer" containerID="6b79b3e2bc0eaf11c9ea6819364ebeedbfc898637030a758fc06485caf27977e" Apr 24 22:25:30.246757 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:30.246734 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49"] Apr 24 22:25:30.248095 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:30.248075 2570 scope.go:117] "RemoveContainer" containerID="5ded870e9f09fba2de2873f11a9e0337cf1d6f6e0a1fc4050f0e8f07d1e5999a" Apr 24 22:25:30.250448 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:30.250428 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-jwx49"] Apr 24 22:25:30.255654 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:30.255639 2570 scope.go:117] "RemoveContainer" containerID="fe1a94878799e00cf0275f454c228b8094404d9175445150903869d2f29f7aee" Apr 24 22:25:30.255895 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:25:30.255878 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe1a94878799e00cf0275f454c228b8094404d9175445150903869d2f29f7aee\": container with ID starting with fe1a94878799e00cf0275f454c228b8094404d9175445150903869d2f29f7aee not found: ID does not exist" containerID="fe1a94878799e00cf0275f454c228b8094404d9175445150903869d2f29f7aee" Apr 24 22:25:30.255964 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:30.255900 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe1a94878799e00cf0275f454c228b8094404d9175445150903869d2f29f7aee"} err="failed to get container status \"fe1a94878799e00cf0275f454c228b8094404d9175445150903869d2f29f7aee\": rpc error: code = NotFound desc = could not find container \"fe1a94878799e00cf0275f454c228b8094404d9175445150903869d2f29f7aee\": container with ID starting with fe1a94878799e00cf0275f454c228b8094404d9175445150903869d2f29f7aee not found: ID does not exist" Apr 24 22:25:30.255964 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:30.255916 
2570 scope.go:117] "RemoveContainer" containerID="6b79b3e2bc0eaf11c9ea6819364ebeedbfc898637030a758fc06485caf27977e" Apr 24 22:25:30.256208 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:25:30.256191 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b79b3e2bc0eaf11c9ea6819364ebeedbfc898637030a758fc06485caf27977e\": container with ID starting with 6b79b3e2bc0eaf11c9ea6819364ebeedbfc898637030a758fc06485caf27977e not found: ID does not exist" containerID="6b79b3e2bc0eaf11c9ea6819364ebeedbfc898637030a758fc06485caf27977e" Apr 24 22:25:30.256256 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:30.256215 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b79b3e2bc0eaf11c9ea6819364ebeedbfc898637030a758fc06485caf27977e"} err="failed to get container status \"6b79b3e2bc0eaf11c9ea6819364ebeedbfc898637030a758fc06485caf27977e\": rpc error: code = NotFound desc = could not find container \"6b79b3e2bc0eaf11c9ea6819364ebeedbfc898637030a758fc06485caf27977e\": container with ID starting with 6b79b3e2bc0eaf11c9ea6819364ebeedbfc898637030a758fc06485caf27977e not found: ID does not exist" Apr 24 22:25:30.256256 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:30.256235 2570 scope.go:117] "RemoveContainer" containerID="5ded870e9f09fba2de2873f11a9e0337cf1d6f6e0a1fc4050f0e8f07d1e5999a" Apr 24 22:25:30.256477 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:25:30.256460 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ded870e9f09fba2de2873f11a9e0337cf1d6f6e0a1fc4050f0e8f07d1e5999a\": container with ID starting with 5ded870e9f09fba2de2873f11a9e0337cf1d6f6e0a1fc4050f0e8f07d1e5999a not found: ID does not exist" containerID="5ded870e9f09fba2de2873f11a9e0337cf1d6f6e0a1fc4050f0e8f07d1e5999a" Apr 24 22:25:30.256527 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:30.256483 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ded870e9f09fba2de2873f11a9e0337cf1d6f6e0a1fc4050f0e8f07d1e5999a"} err="failed to get container status \"5ded870e9f09fba2de2873f11a9e0337cf1d6f6e0a1fc4050f0e8f07d1e5999a\": rpc error: code = NotFound desc = could not find container \"5ded870e9f09fba2de2873f11a9e0337cf1d6f6e0a1fc4050f0e8f07d1e5999a\": container with ID starting with 5ded870e9f09fba2de2873f11a9e0337cf1d6f6e0a1fc4050f0e8f07d1e5999a not found: ID does not exist" Apr 24 22:25:32.069355 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:32.069324 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b156735-4d33-43ad-9a66-faf4c1813a90" path="/var/lib/kubelet/pods/0b156735-4d33-43ad-9a66-faf4c1813a90/volumes" Apr 24 22:25:35.232450 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:35.232425 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" Apr 24 22:25:35.233068 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:35.233017 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" podUID="2502446c-803a-4176-a7fd-565b6bdf0371" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 24 22:25:45.233807 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:45.233769 2570 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" podUID="2502446c-803a-4176-a7fd-565b6bdf0371" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 24 22:25:55.233966 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:25:55.233930 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" podUID="2502446c-803a-4176-a7fd-565b6bdf0371" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 24 22:26:05.233949 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:05.233914 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" podUID="2502446c-803a-4176-a7fd-565b6bdf0371" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 24 22:26:15.233578 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:15.233537 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" podUID="2502446c-803a-4176-a7fd-565b6bdf0371" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 24 22:26:25.233308 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:25.233264 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" podUID="2502446c-803a-4176-a7fd-565b6bdf0371" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused" Apr 24 22:26:35.234091 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:35.234057 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" Apr 24 22:26:35.932983 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:35.932939 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8"] Apr 24 22:26:35.933292 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:35.933264 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" podUID="2502446c-803a-4176-a7fd-565b6bdf0371" containerName="kserve-container" containerID="cri-o://d1237945949860850242f053dbd2cc42cc8e9e28b1112901a5313cc4b239e071" gracePeriod=30 Apr 24 22:26:35.933384 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:35.933300 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" podUID="2502446c-803a-4176-a7fd-565b6bdf0371" containerName="kube-rbac-proxy" containerID="cri-o://f25c92ad899cede7ef08e2f0828349a57e301913aba00705c60765512c487118" gracePeriod=30 Apr 24 22:26:36.053374 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.053341 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz"] Apr 24 22:26:36.053617 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.053606 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b156735-4d33-43ad-9a66-faf4c1813a90" containerName="storage-initializer" Apr 24 22:26:36.053670 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.053619 2570 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="0b156735-4d33-43ad-9a66-faf4c1813a90" containerName="storage-initializer" Apr 24 22:26:36.053670 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.053634 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b156735-4d33-43ad-9a66-faf4c1813a90" containerName="kube-rbac-proxy" Apr 24 22:26:36.053670 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.053640 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b156735-4d33-43ad-9a66-faf4c1813a90" containerName="kube-rbac-proxy" Apr 24 22:26:36.053670 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.053652 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b156735-4d33-43ad-9a66-faf4c1813a90" containerName="kserve-container" Apr 24 22:26:36.053670 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.053657 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b156735-4d33-43ad-9a66-faf4c1813a90" containerName="kserve-container" Apr 24 22:26:36.053845 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.053707 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b156735-4d33-43ad-9a66-faf4c1813a90" containerName="kserve-container" Apr 24 22:26:36.053845 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.053716 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b156735-4d33-43ad-9a66-faf4c1813a90" containerName="kube-rbac-proxy" Apr 24 22:26:36.056584 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.056567 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:26:36.058948 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.058922 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-predictor-serving-cert\"" Apr 24 22:26:36.059087 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.058992 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\"" Apr 24 22:26:36.059087 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.059002 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 22:26:36.070760 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.070734 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz"] Apr 24 22:26:36.118780 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.118736 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz\" (UID: \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:26:36.118970 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.118790 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz\" (UID: \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:26:36.118970 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.118890 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz\" (UID: \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:26:36.118970 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.118928 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkh4s\" (UniqueName: \"kubernetes.io/projected/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-kube-api-access-mkh4s\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz\" (UID: \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:26:36.118970 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.118959 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz\" (UID: \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:26:36.220251 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.220140 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz\" (UID: \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:26:36.220251 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.220246 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz\" (UID: \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:26:36.220510 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.220279 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz\" (UID: \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:26:36.220510 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.220334 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz\" (UID: 
\"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:26:36.220510 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.220359 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mkh4s\" (UniqueName: \"kubernetes.io/projected/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-kube-api-access-mkh4s\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz\" (UID: \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:26:36.220510 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:26:36.220463 2570 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-serving-cert: secret "isvc-sklearn-s3-tls-global-pass-predictor-serving-cert" not found Apr 24 22:26:36.220726 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:26:36.220549 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-proxy-tls podName:e37d4074-c1ee-4f83-9e7b-585b73c36c6d nodeName:}" failed. No retries permitted until 2026-04-24 22:26:36.720527758 +0000 UTC m=+3549.144035159 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-proxy-tls") pod "isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" (UID: "e37d4074-c1ee-4f83-9e7b-585b73c36c6d") : secret "isvc-sklearn-s3-tls-global-pass-predictor-serving-cert" not found Apr 24 22:26:36.220885 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.220855 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz\" (UID: \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:26:36.221008 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.220990 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz\" (UID: \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:26:36.221098 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.221002 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz\" (UID: \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:26:36.232409 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.232371 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkh4s\" (UniqueName: \"kubernetes.io/projected/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-kube-api-access-mkh4s\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz\" (UID: \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:26:36.419191 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.419153 2570 generic.go:358] "Generic (PLEG): container finished" podID="2502446c-803a-4176-a7fd-565b6bdf0371" containerID="f25c92ad899cede7ef08e2f0828349a57e301913aba00705c60765512c487118" exitCode=2 Apr 24 22:26:36.419573 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.419231 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" event={"ID":"2502446c-803a-4176-a7fd-565b6bdf0371","Type":"ContainerDied","Data":"f25c92ad899cede7ef08e2f0828349a57e301913aba00705c60765512c487118"} Apr 24 22:26:36.725207 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.725149 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz\" (UID: \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:26:36.727837 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.727798 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz\" (UID: \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:26:36.968563 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:36.968522 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:26:37.095012 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:37.094980 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz"] Apr 24 22:26:37.098776 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:26:37.098744 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode37d4074_c1ee_4f83_9e7b_585b73c36c6d.slice/crio-68e0b1421397015323f7eb18c79379df80bb92cb319888f64faa841f652a27f1 WatchSource:0}: Error finding container 68e0b1421397015323f7eb18c79379df80bb92cb319888f64faa841f652a27f1: Status 404 returned error can't find the container with id 68e0b1421397015323f7eb18c79379df80bb92cb319888f64faa841f652a27f1 Apr 24 22:26:37.424809 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:37.424772 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" event={"ID":"e37d4074-c1ee-4f83-9e7b-585b73c36c6d","Type":"ContainerStarted","Data":"8a43725e929974fc81bd0d251ddace03136b1e024bb989a8b76b4fbb02843c47"} Apr 24 22:26:37.424809 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:37.424812 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" event={"ID":"e37d4074-c1ee-4f83-9e7b-585b73c36c6d","Type":"ContainerStarted","Data":"68e0b1421397015323f7eb18c79379df80bb92cb319888f64faa841f652a27f1"} Apr 24 22:26:38.429448 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:38.429417 2570 generic.go:358] "Generic (PLEG): container finished" 
podID="e37d4074-c1ee-4f83-9e7b-585b73c36c6d" containerID="8a43725e929974fc81bd0d251ddace03136b1e024bb989a8b76b4fbb02843c47" exitCode=0 Apr 24 22:26:38.429831 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:38.429507 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" event={"ID":"e37d4074-c1ee-4f83-9e7b-585b73c36c6d","Type":"ContainerDied","Data":"8a43725e929974fc81bd0d251ddace03136b1e024bb989a8b76b4fbb02843c47"} Apr 24 22:26:39.434703 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:39.434667 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" event={"ID":"e37d4074-c1ee-4f83-9e7b-585b73c36c6d","Type":"ContainerStarted","Data":"6ca54ac642b97c455bb18ecc4e60c63e464b82c3df5b8cb56c39491ed5a93fa7"} Apr 24 22:26:39.434703 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:39.434702 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" event={"ID":"e37d4074-c1ee-4f83-9e7b-585b73c36c6d","Type":"ContainerStarted","Data":"bbaa20ea165ff92c623662aedeab041c99fbea1bee883752869999196519737d"} Apr 24 22:26:39.435260 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:39.434877 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:26:39.435260 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:39.434904 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:26:39.436168 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:39.436142 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" podUID="e37d4074-c1ee-4f83-9e7b-585b73c36c6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 24 22:26:39.457436 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:39.457378 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" podStartSLOduration=3.457360444 podStartE2EDuration="3.457360444s" podCreationTimestamp="2026-04-24 22:26:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:26:39.455290673 +0000 UTC m=+3551.878798081" watchObservedRunningTime="2026-04-24 22:26:39.457360444 +0000 UTC m=+3551.880867852" Apr 24 22:26:40.229255 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:40.229205 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" podUID="2502446c-803a-4176-a7fd-565b6bdf0371" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.61:8643/healthz\": dial tcp 10.133.0.61:8643: connect: connection refused" Apr 24 22:26:40.437397 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:40.437357 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" podUID="e37d4074-c1ee-4f83-9e7b-585b73c36c6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection 
refused" Apr 24 22:26:40.669459 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:40.669436 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" Apr 24 22:26:40.760312 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:40.760268 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2502446c-803a-4176-a7fd-565b6bdf0371-proxy-tls\") pod \"2502446c-803a-4176-a7fd-565b6bdf0371\" (UID: \"2502446c-803a-4176-a7fd-565b6bdf0371\") " Apr 24 22:26:40.760616 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:40.760360 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2502446c-803a-4176-a7fd-565b6bdf0371-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"2502446c-803a-4176-a7fd-565b6bdf0371\" (UID: \"2502446c-803a-4176-a7fd-565b6bdf0371\") " Apr 24 22:26:40.760616 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:40.760404 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2502446c-803a-4176-a7fd-565b6bdf0371-kserve-provision-location\") pod \"2502446c-803a-4176-a7fd-565b6bdf0371\" (UID: \"2502446c-803a-4176-a7fd-565b6bdf0371\") " Apr 24 22:26:40.760616 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:40.760440 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5zb5\" (UniqueName: \"kubernetes.io/projected/2502446c-803a-4176-a7fd-565b6bdf0371-kube-api-access-n5zb5\") pod \"2502446c-803a-4176-a7fd-565b6bdf0371\" (UID: \"2502446c-803a-4176-a7fd-565b6bdf0371\") " Apr 24 22:26:40.760746 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:40.760720 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2502446c-803a-4176-a7fd-565b6bdf0371-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2502446c-803a-4176-a7fd-565b6bdf0371" (UID: "2502446c-803a-4176-a7fd-565b6bdf0371"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:26:40.760785 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:40.760750 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2502446c-803a-4176-a7fd-565b6bdf0371-isvc-sklearn-s3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-kube-rbac-proxy-sar-config") pod "2502446c-803a-4176-a7fd-565b6bdf0371" (UID: "2502446c-803a-4176-a7fd-565b6bdf0371"). InnerVolumeSpecName "isvc-sklearn-s3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:26:40.762668 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:40.762639 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2502446c-803a-4176-a7fd-565b6bdf0371-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2502446c-803a-4176-a7fd-565b6bdf0371" (UID: "2502446c-803a-4176-a7fd-565b6bdf0371"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:26:40.762668 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:40.762646 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2502446c-803a-4176-a7fd-565b6bdf0371-kube-api-access-n5zb5" (OuterVolumeSpecName: "kube-api-access-n5zb5") pod "2502446c-803a-4176-a7fd-565b6bdf0371" (UID: "2502446c-803a-4176-a7fd-565b6bdf0371"). InnerVolumeSpecName "kube-api-access-n5zb5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:26:40.861228 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:40.861126 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2502446c-803a-4176-a7fd-565b6bdf0371-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:26:40.861228 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:40.861162 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2502446c-803a-4176-a7fd-565b6bdf0371-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:26:40.861228 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:40.861177 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2502446c-803a-4176-a7fd-565b6bdf0371-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:26:40.861228 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:40.861201 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n5zb5\" (UniqueName: \"kubernetes.io/projected/2502446c-803a-4176-a7fd-565b6bdf0371-kube-api-access-n5zb5\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:26:41.446869 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:41.446828 2570 generic.go:358] "Generic (PLEG): container finished" podID="2502446c-803a-4176-a7fd-565b6bdf0371" containerID="d1237945949860850242f053dbd2cc42cc8e9e28b1112901a5313cc4b239e071" exitCode=0 Apr 24 22:26:41.447336 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:41.446889 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" event={"ID":"2502446c-803a-4176-a7fd-565b6bdf0371","Type":"ContainerDied","Data":"d1237945949860850242f053dbd2cc42cc8e9e28b1112901a5313cc4b239e071"} Apr 24 22:26:41.447336 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:41.446925 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" event={"ID":"2502446c-803a-4176-a7fd-565b6bdf0371","Type":"ContainerDied","Data":"7e38e982d1b29cf94546b63365b80f2c5d6bbbbe3a2fe06b707c6bba34c3ca56"} Apr 24 22:26:41.447336 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:41.446930 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8" Apr 24 22:26:41.447336 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:41.446943 2570 scope.go:117] "RemoveContainer" containerID="f25c92ad899cede7ef08e2f0828349a57e301913aba00705c60765512c487118" Apr 24 22:26:41.455905 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:41.455882 2570 scope.go:117] "RemoveContainer" containerID="d1237945949860850242f053dbd2cc42cc8e9e28b1112901a5313cc4b239e071" Apr 24 22:26:41.464399 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:41.464376 2570 scope.go:117] "RemoveContainer" containerID="ecb916b5f121106cc0570c3ec08d905a85b743d9aacd8e21dfb183caffad762c" Apr 24 22:26:41.472720 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:41.472700 2570 scope.go:117] "RemoveContainer" containerID="f25c92ad899cede7ef08e2f0828349a57e301913aba00705c60765512c487118" Apr 24 22:26:41.473011 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:26:41.472988 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f25c92ad899cede7ef08e2f0828349a57e301913aba00705c60765512c487118\": container with ID starting with f25c92ad899cede7ef08e2f0828349a57e301913aba00705c60765512c487118 not found: ID does not exist" containerID="f25c92ad899cede7ef08e2f0828349a57e301913aba00705c60765512c487118" Apr 24 22:26:41.473112 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:41.473042 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f25c92ad899cede7ef08e2f0828349a57e301913aba00705c60765512c487118"} err="failed to get container status \"f25c92ad899cede7ef08e2f0828349a57e301913aba00705c60765512c487118\": rpc error: code = NotFound desc = could not find container \"f25c92ad899cede7ef08e2f0828349a57e301913aba00705c60765512c487118\": container with ID starting with f25c92ad899cede7ef08e2f0828349a57e301913aba00705c60765512c487118 not found: ID does not exist" Apr 24 22:26:41.473112 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:41.473070 2570 scope.go:117] "RemoveContainer" containerID="d1237945949860850242f053dbd2cc42cc8e9e28b1112901a5313cc4b239e071" Apr 24 22:26:41.473348 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:26:41.473331 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1237945949860850242f053dbd2cc42cc8e9e28b1112901a5313cc4b239e071\": container with ID starting with d1237945949860850242f053dbd2cc42cc8e9e28b1112901a5313cc4b239e071 not found: ID does not exist" containerID="d1237945949860850242f053dbd2cc42cc8e9e28b1112901a5313cc4b239e071" Apr 24 22:26:41.473390 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:41.473356 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1237945949860850242f053dbd2cc42cc8e9e28b1112901a5313cc4b239e071"} err="failed to get container status \"d1237945949860850242f053dbd2cc42cc8e9e28b1112901a5313cc4b239e071\": rpc error: code = NotFound desc = could not find container \"d1237945949860850242f053dbd2cc42cc8e9e28b1112901a5313cc4b239e071\": container with ID starting with d1237945949860850242f053dbd2cc42cc8e9e28b1112901a5313cc4b239e071 not found: ID does not exist" Apr 24 22:26:41.473390 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:41.473373 2570 scope.go:117] "RemoveContainer" containerID="ecb916b5f121106cc0570c3ec08d905a85b743d9aacd8e21dfb183caffad762c" Apr 24 22:26:41.473656 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:26:41.473635 
2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecb916b5f121106cc0570c3ec08d905a85b743d9aacd8e21dfb183caffad762c\": container with ID starting with ecb916b5f121106cc0570c3ec08d905a85b743d9aacd8e21dfb183caffad762c not found: ID does not exist" containerID="ecb916b5f121106cc0570c3ec08d905a85b743d9aacd8e21dfb183caffad762c" Apr 24 22:26:41.473705 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:41.473662 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecb916b5f121106cc0570c3ec08d905a85b743d9aacd8e21dfb183caffad762c"} err="failed to get container status \"ecb916b5f121106cc0570c3ec08d905a85b743d9aacd8e21dfb183caffad762c\": rpc error: code = NotFound desc = could not find container \"ecb916b5f121106cc0570c3ec08d905a85b743d9aacd8e21dfb183caffad762c\": container with ID starting with ecb916b5f121106cc0570c3ec08d905a85b743d9aacd8e21dfb183caffad762c not found: ID does not exist" Apr 24 22:26:41.488768 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:41.488731 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8"] Apr 24 22:26:41.494560 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:41.494526 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5455b66b9c-g9rw8"] Apr 24 22:26:42.067109 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:42.067077 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2502446c-803a-4176-a7fd-565b6bdf0371" path="/var/lib/kubelet/pods/2502446c-803a-4176-a7fd-565b6bdf0371/volumes" Apr 24 22:26:45.441481 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:45.441452 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:26:45.441984 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:45.441959 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" podUID="e37d4074-c1ee-4f83-9e7b-585b73c36c6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 24 22:26:55.442466 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:26:55.442426 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" podUID="e37d4074-c1ee-4f83-9e7b-585b73c36c6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 24 22:27:05.442145 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:05.442109 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" podUID="e37d4074-c1ee-4f83-9e7b-585b73c36c6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 24 22:27:15.442468 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:15.442377 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" podUID="e37d4074-c1ee-4f83-9e7b-585b73c36c6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 24 22:27:25.442085 ip-10-0-129-230 kubenswrapper[2570]: I0424 
22:27:25.442017 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" podUID="e37d4074-c1ee-4f83-9e7b-585b73c36c6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 24 22:27:35.441918 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:35.441878 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" podUID="e37d4074-c1ee-4f83-9e7b-585b73c36c6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 24 22:27:45.442184 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:45.442154 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:27:46.117812 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:46.117783 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz"] Apr 24 22:27:46.118129 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:46.118104 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" podUID="e37d4074-c1ee-4f83-9e7b-585b73c36c6d" containerName="kube-rbac-proxy" containerID="cri-o://6ca54ac642b97c455bb18ecc4e60c63e464b82c3df5b8cb56c39491ed5a93fa7" gracePeriod=30 Apr 24 22:27:46.118219 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:46.118083 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" podUID="e37d4074-c1ee-4f83-9e7b-585b73c36c6d" containerName="kserve-container" containerID="cri-o://bbaa20ea165ff92c623662aedeab041c99fbea1bee883752869999196519737d" gracePeriod=30 Apr 24 22:27:46.642286 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:46.642251 2570 generic.go:358] "Generic (PLEG): container finished" podID="e37d4074-c1ee-4f83-9e7b-585b73c36c6d" containerID="6ca54ac642b97c455bb18ecc4e60c63e464b82c3df5b8cb56c39491ed5a93fa7" exitCode=2 Apr 24 22:27:46.642286 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:46.642286 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" event={"ID":"e37d4074-c1ee-4f83-9e7b-585b73c36c6d","Type":"ContainerDied","Data":"6ca54ac642b97c455bb18ecc4e60c63e464b82c3df5b8cb56c39491ed5a93fa7"} Apr 24 22:27:47.265261 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.265222 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm"] Apr 24 22:27:47.265588 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.265571 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2502446c-803a-4176-a7fd-565b6bdf0371" containerName="storage-initializer" Apr 24 22:27:47.265678 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.265591 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="2502446c-803a-4176-a7fd-565b6bdf0371" containerName="storage-initializer" Apr 24 22:27:47.265678 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.265602 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2502446c-803a-4176-a7fd-565b6bdf0371" 
containerName="kube-rbac-proxy" Apr 24 22:27:47.265678 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.265610 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="2502446c-803a-4176-a7fd-565b6bdf0371" containerName="kube-rbac-proxy" Apr 24 22:27:47.265678 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.265625 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2502446c-803a-4176-a7fd-565b6bdf0371" containerName="kserve-container" Apr 24 22:27:47.265678 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.265634 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="2502446c-803a-4176-a7fd-565b6bdf0371" containerName="kserve-container" Apr 24 22:27:47.265933 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.265715 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="2502446c-803a-4176-a7fd-565b6bdf0371" containerName="kserve-container" Apr 24 22:27:47.265933 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.265727 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="2502446c-803a-4176-a7fd-565b6bdf0371" containerName="kube-rbac-proxy" Apr 24 22:27:47.268799 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.268779 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm" Apr 24 22:27:47.274935 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.274911 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-predictor-serving-cert\"" Apr 24 22:27:47.275716 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.275696 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\"" Apr 24 22:27:47.282831 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.282808 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm"] Apr 24 22:27:47.369452 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.369412 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dabf7add-90b3-43f2-ad17-a680319d09e3-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm\" (UID: \"dabf7add-90b3-43f2-ad17-a680319d09e3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm" Apr 24 22:27:47.369452 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.369451 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qgbx\" (UniqueName: \"kubernetes.io/projected/dabf7add-90b3-43f2-ad17-a680319d09e3-kube-api-access-4qgbx\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm\" (UID: \"dabf7add-90b3-43f2-ad17-a680319d09e3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm" Apr 24 22:27:47.369656 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.369477 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dabf7add-90b3-43f2-ad17-a680319d09e3-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm\" (UID: \"dabf7add-90b3-43f2-ad17-a680319d09e3\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm" Apr 24 22:27:47.369656 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.369531 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dabf7add-90b3-43f2-ad17-a680319d09e3-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm\" (UID: \"dabf7add-90b3-43f2-ad17-a680319d09e3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm" Apr 24 22:27:47.470368 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.470333 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dabf7add-90b3-43f2-ad17-a680319d09e3-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm\" (UID: \"dabf7add-90b3-43f2-ad17-a680319d09e3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm" Apr 24 22:27:47.470368 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.470368 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qgbx\" (UniqueName: \"kubernetes.io/projected/dabf7add-90b3-43f2-ad17-a680319d09e3-kube-api-access-4qgbx\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm\" (UID: \"dabf7add-90b3-43f2-ad17-a680319d09e3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm" Apr 24 22:27:47.470571 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.470392 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dabf7add-90b3-43f2-ad17-a680319d09e3-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm\" (UID: \"dabf7add-90b3-43f2-ad17-a680319d09e3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm" Apr 24 22:27:47.470571 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.470414 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dabf7add-90b3-43f2-ad17-a680319d09e3-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm\" (UID: \"dabf7add-90b3-43f2-ad17-a680319d09e3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm" Apr 24 22:27:47.470770 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.470746 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dabf7add-90b3-43f2-ad17-a680319d09e3-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm\" (UID: \"dabf7add-90b3-43f2-ad17-a680319d09e3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm" Apr 24 22:27:47.471014 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.470996 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dabf7add-90b3-43f2-ad17-a680319d09e3-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod 
\"isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm\" (UID: \"dabf7add-90b3-43f2-ad17-a680319d09e3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm" Apr 24 22:27:47.472968 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.472948 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dabf7add-90b3-43f2-ad17-a680319d09e3-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm\" (UID: \"dabf7add-90b3-43f2-ad17-a680319d09e3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm" Apr 24 22:27:47.479174 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.479150 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qgbx\" (UniqueName: \"kubernetes.io/projected/dabf7add-90b3-43f2-ad17-a680319d09e3-kube-api-access-4qgbx\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm\" (UID: \"dabf7add-90b3-43f2-ad17-a680319d09e3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm" Apr 24 22:27:47.578844 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.578756 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm" Apr 24 22:27:47.701604 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:47.701579 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm"] Apr 24 22:27:47.704260 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:27:47.704231 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddabf7add_90b3_43f2_ad17_a680319d09e3.slice/crio-be152388496fb46c73b615b46276ddeb6fd6245cf9a57707ab2f8d59b94d714e WatchSource:0}: Error finding container be152388496fb46c73b615b46276ddeb6fd6245cf9a57707ab2f8d59b94d714e: Status 404 returned error can't find the container with id be152388496fb46c73b615b46276ddeb6fd6245cf9a57707ab2f8d59b94d714e Apr 24 22:27:48.649567 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:48.649526 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm" event={"ID":"dabf7add-90b3-43f2-ad17-a680319d09e3","Type":"ContainerStarted","Data":"04ecf4de4fe7bd70f305225afd2e4c1f752e1c9988713b128e544a86867120ce"} Apr 24 22:27:48.649567 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:48.649569 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm" event={"ID":"dabf7add-90b3-43f2-ad17-a680319d09e3","Type":"ContainerStarted","Data":"be152388496fb46c73b615b46276ddeb6fd6245cf9a57707ab2f8d59b94d714e"} Apr 24 22:27:50.163520 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.163498 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:27:50.188908 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.188885 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\" (UID: \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\") " Apr 24 22:27:50.189056 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.188927 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkh4s\" (UniqueName: \"kubernetes.io/projected/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-kube-api-access-mkh4s\") pod \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\" (UID: \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\") " Apr 24 22:27:50.189056 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.188950 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-cabundle-cert\") pod \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\" (UID: \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\") " Apr 24 22:27:50.189056 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.188974 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-proxy-tls\") pod \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\" (UID: \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\") " Apr 24 22:27:50.189056 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.189036 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-kserve-provision-location\") pod \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\" (UID: \"e37d4074-c1ee-4f83-9e7b-585b73c36c6d\") " Apr 24 22:27:50.189403 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.189375 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config") pod "e37d4074-c1ee-4f83-9e7b-585b73c36c6d" (UID: "e37d4074-c1ee-4f83-9e7b-585b73c36c6d"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:27:50.189403 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.189388 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "e37d4074-c1ee-4f83-9e7b-585b73c36c6d" (UID: "e37d4074-c1ee-4f83-9e7b-585b73c36c6d"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:27:50.189591 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.189382 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e37d4074-c1ee-4f83-9e7b-585b73c36c6d" (UID: "e37d4074-c1ee-4f83-9e7b-585b73c36c6d"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:27:50.191273 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.191242 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e37d4074-c1ee-4f83-9e7b-585b73c36c6d" (UID: "e37d4074-c1ee-4f83-9e7b-585b73c36c6d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:27:50.191341 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.191294 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-kube-api-access-mkh4s" (OuterVolumeSpecName: "kube-api-access-mkh4s") pod "e37d4074-c1ee-4f83-9e7b-585b73c36c6d" (UID: "e37d4074-c1ee-4f83-9e7b-585b73c36c6d"). InnerVolumeSpecName "kube-api-access-mkh4s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:27:50.289972 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.289908 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:27:50.289972 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.289933 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:27:50.289972 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.289944 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mkh4s\" (UniqueName: \"kubernetes.io/projected/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-kube-api-access-mkh4s\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:27:50.289972 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.289955 2570 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-cabundle-cert\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:27:50.289972 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.289964 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e37d4074-c1ee-4f83-9e7b-585b73c36c6d-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:27:50.657375 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.657293 2570 generic.go:358] "Generic (PLEG): container finished" podID="e37d4074-c1ee-4f83-9e7b-585b73c36c6d" containerID="bbaa20ea165ff92c623662aedeab041c99fbea1bee883752869999196519737d" exitCode=0 Apr 24 22:27:50.657512 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.657375 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" Apr 24 22:27:50.657512 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.657379 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" event={"ID":"e37d4074-c1ee-4f83-9e7b-585b73c36c6d","Type":"ContainerDied","Data":"bbaa20ea165ff92c623662aedeab041c99fbea1bee883752869999196519737d"} Apr 24 22:27:50.657512 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.657418 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz" event={"ID":"e37d4074-c1ee-4f83-9e7b-585b73c36c6d","Type":"ContainerDied","Data":"68e0b1421397015323f7eb18c79379df80bb92cb319888f64faa841f652a27f1"} Apr 24 22:27:50.657512 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.657433 2570 scope.go:117] "RemoveContainer" containerID="6ca54ac642b97c455bb18ecc4e60c63e464b82c3df5b8cb56c39491ed5a93fa7" Apr 24 22:27:50.665534 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.665517 2570 scope.go:117] "RemoveContainer" containerID="bbaa20ea165ff92c623662aedeab041c99fbea1bee883752869999196519737d" Apr 24 22:27:50.672440 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.672419 2570 scope.go:117] "RemoveContainer" containerID="8a43725e929974fc81bd0d251ddace03136b1e024bb989a8b76b4fbb02843c47" Apr 24 22:27:50.678840 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.678819 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz"] Apr 24 22:27:50.679720 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.679706 2570 scope.go:117] "RemoveContainer" containerID="6ca54ac642b97c455bb18ecc4e60c63e464b82c3df5b8cb56c39491ed5a93fa7" Apr 24 22:27:50.679982 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:27:50.679962 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ca54ac642b97c455bb18ecc4e60c63e464b82c3df5b8cb56c39491ed5a93fa7\": container with ID starting with 6ca54ac642b97c455bb18ecc4e60c63e464b82c3df5b8cb56c39491ed5a93fa7 not found: ID does not exist" containerID="6ca54ac642b97c455bb18ecc4e60c63e464b82c3df5b8cb56c39491ed5a93fa7" Apr 24 22:27:50.680076 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.679989 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ca54ac642b97c455bb18ecc4e60c63e464b82c3df5b8cb56c39491ed5a93fa7"} err="failed to get container status \"6ca54ac642b97c455bb18ecc4e60c63e464b82c3df5b8cb56c39491ed5a93fa7\": rpc error: code = NotFound desc = could not find container \"6ca54ac642b97c455bb18ecc4e60c63e464b82c3df5b8cb56c39491ed5a93fa7\": container with ID starting with 6ca54ac642b97c455bb18ecc4e60c63e464b82c3df5b8cb56c39491ed5a93fa7 not found: ID does not exist" Apr 24 22:27:50.680076 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.680007 2570 scope.go:117] "RemoveContainer" containerID="bbaa20ea165ff92c623662aedeab041c99fbea1bee883752869999196519737d" Apr 24 22:27:50.680498 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:27:50.680467 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbaa20ea165ff92c623662aedeab041c99fbea1bee883752869999196519737d\": container with ID starting with bbaa20ea165ff92c623662aedeab041c99fbea1bee883752869999196519737d not found: ID does not 
exist" containerID="bbaa20ea165ff92c623662aedeab041c99fbea1bee883752869999196519737d" Apr 24 22:27:50.680603 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.680508 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbaa20ea165ff92c623662aedeab041c99fbea1bee883752869999196519737d"} err="failed to get container status \"bbaa20ea165ff92c623662aedeab041c99fbea1bee883752869999196519737d\": rpc error: code = NotFound desc = could not find container \"bbaa20ea165ff92c623662aedeab041c99fbea1bee883752869999196519737d\": container with ID starting with bbaa20ea165ff92c623662aedeab041c99fbea1bee883752869999196519737d not found: ID does not exist" Apr 24 22:27:50.680603 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.680559 2570 scope.go:117] "RemoveContainer" containerID="8a43725e929974fc81bd0d251ddace03136b1e024bb989a8b76b4fbb02843c47" Apr 24 22:27:50.680953 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:27:50.680923 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a43725e929974fc81bd0d251ddace03136b1e024bb989a8b76b4fbb02843c47\": container with ID starting with 8a43725e929974fc81bd0d251ddace03136b1e024bb989a8b76b4fbb02843c47 not found: ID does not exist" containerID="8a43725e929974fc81bd0d251ddace03136b1e024bb989a8b76b4fbb02843c47" Apr 24 22:27:50.681070 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.680962 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a43725e929974fc81bd0d251ddace03136b1e024bb989a8b76b4fbb02843c47"} err="failed to get container status \"8a43725e929974fc81bd0d251ddace03136b1e024bb989a8b76b4fbb02843c47\": rpc error: code = NotFound desc = could not find container \"8a43725e929974fc81bd0d251ddace03136b1e024bb989a8b76b4fbb02843c47\": container with ID starting with 8a43725e929974fc81bd0d251ddace03136b1e024bb989a8b76b4fbb02843c47 not found: ID does not exist" Apr 24 22:27:50.682135 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:50.682119 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-576cdf7dbf-fnrjz"] Apr 24 22:27:52.066710 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:52.066675 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e37d4074-c1ee-4f83-9e7b-585b73c36c6d" path="/var/lib/kubelet/pods/e37d4074-c1ee-4f83-9e7b-585b73c36c6d/volumes" Apr 24 22:27:52.665226 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:52.665189 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm_dabf7add-90b3-43f2-ad17-a680319d09e3/storage-initializer/0.log" Apr 24 22:27:52.665376 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:52.665235 2570 generic.go:358] "Generic (PLEG): container finished" podID="dabf7add-90b3-43f2-ad17-a680319d09e3" containerID="04ecf4de4fe7bd70f305225afd2e4c1f752e1c9988713b128e544a86867120ce" exitCode=1 Apr 24 22:27:52.665376 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:52.665317 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm" event={"ID":"dabf7add-90b3-43f2-ad17-a680319d09e3","Type":"ContainerDied","Data":"04ecf4de4fe7bd70f305225afd2e4c1f752e1c9988713b128e544a86867120ce"} Apr 24 22:27:53.669751 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:53.669725 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm_dabf7add-90b3-43f2-ad17-a680319d09e3/storage-initializer/0.log" Apr 24 22:27:53.670178 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:53.669811 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm" event={"ID":"dabf7add-90b3-43f2-ad17-a680319d09e3","Type":"ContainerStarted","Data":"8b479511d5aac64f0b3cb7e48141bb85e42afa95a1e2aea777c99cb95c5942a2"} Apr 24 22:27:57.187113 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.187072 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm"] Apr 24 22:27:57.187489 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.187392 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm" podUID="dabf7add-90b3-43f2-ad17-a680319d09e3" containerName="storage-initializer" containerID="cri-o://8b479511d5aac64f0b3cb7e48141bb85e42afa95a1e2aea777c99cb95c5942a2" gracePeriod=30 Apr 24 22:27:57.623018 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.622995 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm_dabf7add-90b3-43f2-ad17-a680319d09e3/storage-initializer/1.log" Apr 24 22:27:57.623412 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.623396 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm_dabf7add-90b3-43f2-ad17-a680319d09e3/storage-initializer/0.log" Apr 24 22:27:57.623483 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.623463 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm" Apr 24 22:27:57.685560 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.685524 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm_dabf7add-90b3-43f2-ad17-a680319d09e3/storage-initializer/1.log" Apr 24 22:27:57.686140 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.686118 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm_dabf7add-90b3-43f2-ad17-a680319d09e3/storage-initializer/0.log" Apr 24 22:27:57.686217 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.686178 2570 generic.go:358] "Generic (PLEG): container finished" podID="dabf7add-90b3-43f2-ad17-a680319d09e3" containerID="8b479511d5aac64f0b3cb7e48141bb85e42afa95a1e2aea777c99cb95c5942a2" exitCode=1 Apr 24 22:27:57.686363 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.686328 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm" event={"ID":"dabf7add-90b3-43f2-ad17-a680319d09e3","Type":"ContainerDied","Data":"8b479511d5aac64f0b3cb7e48141bb85e42afa95a1e2aea777c99cb95c5942a2"} Apr 24 22:27:57.686425 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.686393 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm" event={"ID":"dabf7add-90b3-43f2-ad17-a680319d09e3","Type":"ContainerDied","Data":"be152388496fb46c73b615b46276ddeb6fd6245cf9a57707ab2f8d59b94d714e"} Apr 24 22:27:57.686425 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.686411 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm" Apr 24 22:27:57.686533 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.686419 2570 scope.go:117] "RemoveContainer" containerID="8b479511d5aac64f0b3cb7e48141bb85e42afa95a1e2aea777c99cb95c5942a2" Apr 24 22:27:57.694131 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.694110 2570 scope.go:117] "RemoveContainer" containerID="04ecf4de4fe7bd70f305225afd2e4c1f752e1c9988713b128e544a86867120ce" Apr 24 22:27:57.701159 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.701142 2570 scope.go:117] "RemoveContainer" containerID="8b479511d5aac64f0b3cb7e48141bb85e42afa95a1e2aea777c99cb95c5942a2" Apr 24 22:27:57.701442 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:27:57.701423 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b479511d5aac64f0b3cb7e48141bb85e42afa95a1e2aea777c99cb95c5942a2\": container with ID starting with 8b479511d5aac64f0b3cb7e48141bb85e42afa95a1e2aea777c99cb95c5942a2 not found: ID does not exist" containerID="8b479511d5aac64f0b3cb7e48141bb85e42afa95a1e2aea777c99cb95c5942a2" Apr 24 22:27:57.701499 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.701451 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b479511d5aac64f0b3cb7e48141bb85e42afa95a1e2aea777c99cb95c5942a2"} err="failed to get container status \"8b479511d5aac64f0b3cb7e48141bb85e42afa95a1e2aea777c99cb95c5942a2\": rpc error: code = NotFound desc = could not find container \"8b479511d5aac64f0b3cb7e48141bb85e42afa95a1e2aea777c99cb95c5942a2\": container with ID starting with 8b479511d5aac64f0b3cb7e48141bb85e42afa95a1e2aea777c99cb95c5942a2 not found: ID does not exist" Apr 24 22:27:57.701499 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.701469 2570 scope.go:117] "RemoveContainer" containerID="04ecf4de4fe7bd70f305225afd2e4c1f752e1c9988713b128e544a86867120ce" Apr 24 22:27:57.701705 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:27:57.701688 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04ecf4de4fe7bd70f305225afd2e4c1f752e1c9988713b128e544a86867120ce\": container with ID starting with 04ecf4de4fe7bd70f305225afd2e4c1f752e1c9988713b128e544a86867120ce not found: ID does not exist" containerID="04ecf4de4fe7bd70f305225afd2e4c1f752e1c9988713b128e544a86867120ce" Apr 24 22:27:57.701751 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.701712 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04ecf4de4fe7bd70f305225afd2e4c1f752e1c9988713b128e544a86867120ce"} err="failed to get container status \"04ecf4de4fe7bd70f305225afd2e4c1f752e1c9988713b128e544a86867120ce\": rpc error: code = NotFound desc = could not find container \"04ecf4de4fe7bd70f305225afd2e4c1f752e1c9988713b128e544a86867120ce\": container with ID starting with 04ecf4de4fe7bd70f305225afd2e4c1f752e1c9988713b128e544a86867120ce not found: ID does not exist" Apr 24 22:27:57.748444 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.748384 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dabf7add-90b3-43f2-ad17-a680319d09e3-proxy-tls\") pod \"dabf7add-90b3-43f2-ad17-a680319d09e3\" (UID: \"dabf7add-90b3-43f2-ad17-a680319d09e3\") " Apr 24 22:27:57.748444 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.748418 2570 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qgbx\" (UniqueName: \"kubernetes.io/projected/dabf7add-90b3-43f2-ad17-a680319d09e3-kube-api-access-4qgbx\") pod \"dabf7add-90b3-43f2-ad17-a680319d09e3\" (UID: \"dabf7add-90b3-43f2-ad17-a680319d09e3\") " Apr 24 22:27:57.748615 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.748461 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dabf7add-90b3-43f2-ad17-a680319d09e3-kserve-provision-location\") pod \"dabf7add-90b3-43f2-ad17-a680319d09e3\" (UID: \"dabf7add-90b3-43f2-ad17-a680319d09e3\") " Apr 24 22:27:57.748615 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.748581 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dabf7add-90b3-43f2-ad17-a680319d09e3-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"dabf7add-90b3-43f2-ad17-a680319d09e3\" (UID: \"dabf7add-90b3-43f2-ad17-a680319d09e3\") " Apr 24 22:27:57.748719 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.748691 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dabf7add-90b3-43f2-ad17-a680319d09e3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dabf7add-90b3-43f2-ad17-a680319d09e3" (UID: "dabf7add-90b3-43f2-ad17-a680319d09e3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:27:57.748868 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.748841 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dabf7add-90b3-43f2-ad17-a680319d09e3-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:27:57.748976 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.748868 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dabf7add-90b3-43f2-ad17-a680319d09e3-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config") pod "dabf7add-90b3-43f2-ad17-a680319d09e3" (UID: "dabf7add-90b3-43f2-ad17-a680319d09e3"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:27:57.750514 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.750492 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dabf7add-90b3-43f2-ad17-a680319d09e3-kube-api-access-4qgbx" (OuterVolumeSpecName: "kube-api-access-4qgbx") pod "dabf7add-90b3-43f2-ad17-a680319d09e3" (UID: "dabf7add-90b3-43f2-ad17-a680319d09e3"). InnerVolumeSpecName "kube-api-access-4qgbx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:27:57.750640 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.750617 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabf7add-90b3-43f2-ad17-a680319d09e3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "dabf7add-90b3-43f2-ad17-a680319d09e3" (UID: "dabf7add-90b3-43f2-ad17-a680319d09e3"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:27:57.849922 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.849875 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dabf7add-90b3-43f2-ad17-a680319d09e3-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:27:57.849922 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.849920 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dabf7add-90b3-43f2-ad17-a680319d09e3-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:27:57.849922 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:57.849931 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4qgbx\" (UniqueName: \"kubernetes.io/projected/dabf7add-90b3-43f2-ad17-a680319d09e3-kube-api-access-4qgbx\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:27:58.021697 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.021612 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm"] Apr 24 22:27:58.027159 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.027135 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6fbc85fd87-2whfm"] Apr 24 22:27:58.068117 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.068082 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dabf7add-90b3-43f2-ad17-a680319d09e3" path="/var/lib/kubelet/pods/dabf7add-90b3-43f2-ad17-a680319d09e3/volumes" Apr 24 22:27:58.286981 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.286908 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng"] Apr 24 22:27:58.287343 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.287174 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dabf7add-90b3-43f2-ad17-a680319d09e3" containerName="storage-initializer" Apr 24 22:27:58.287343 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.287185 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabf7add-90b3-43f2-ad17-a680319d09e3" containerName="storage-initializer" Apr 24 22:27:58.287343 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.287199 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e37d4074-c1ee-4f83-9e7b-585b73c36c6d" containerName="kube-rbac-proxy" Apr 24 22:27:58.287343 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.287206 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37d4074-c1ee-4f83-9e7b-585b73c36c6d" containerName="kube-rbac-proxy" Apr 24 22:27:58.287343 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.287214 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e37d4074-c1ee-4f83-9e7b-585b73c36c6d" containerName="kserve-container" Apr 24 22:27:58.287343 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.287220 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37d4074-c1ee-4f83-9e7b-585b73c36c6d" containerName="kserve-container" Apr 24 22:27:58.287343 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.287232 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="e37d4074-c1ee-4f83-9e7b-585b73c36c6d" containerName="storage-initializer" Apr 24 22:27:58.287343 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.287237 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37d4074-c1ee-4f83-9e7b-585b73c36c6d" containerName="storage-initializer" Apr 24 22:27:58.287343 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.287274 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="dabf7add-90b3-43f2-ad17-a680319d09e3" containerName="storage-initializer" Apr 24 22:27:58.287343 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.287281 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="dabf7add-90b3-43f2-ad17-a680319d09e3" containerName="storage-initializer" Apr 24 22:27:58.287343 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.287290 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="e37d4074-c1ee-4f83-9e7b-585b73c36c6d" containerName="kserve-container" Apr 24 22:27:58.287343 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.287297 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="e37d4074-c1ee-4f83-9e7b-585b73c36c6d" containerName="kube-rbac-proxy" Apr 24 22:27:58.287343 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.287340 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dabf7add-90b3-43f2-ad17-a680319d09e3" containerName="storage-initializer" Apr 24 22:27:58.287343 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.287346 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabf7add-90b3-43f2-ad17-a680319d09e3" containerName="storage-initializer" Apr 24 22:27:58.291713 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.291697 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:27:58.294188 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.294157 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert\"" Apr 24 22:27:58.294327 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.294232 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 22:27:58.294327 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.294257 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-qlzl5\"" Apr 24 22:27:58.294327 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.294255 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 22:27:58.294327 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.294272 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 24 22:27:58.294548 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.294533 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\"" Apr 24 22:27:58.294594 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.294562 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 22:27:58.299476 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.299448 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng"] Apr 24 22:27:58.354668 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.354637 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqrw9\" (UniqueName: \"kubernetes.io/projected/0a82dc86-d37e-4b02-87ac-6cd0616664c7-kube-api-access-cqrw9\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng\" (UID: \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:27:58.354836 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.354680 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a82dc86-d37e-4b02-87ac-6cd0616664c7-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng\" (UID: \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:27:58.354836 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.354728 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0a82dc86-d37e-4b02-87ac-6cd0616664c7-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng\" (UID: \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:27:58.354836 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.354752 
2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a82dc86-d37e-4b02-87ac-6cd0616664c7-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng\" (UID: \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:27:58.354836 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.354815 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a82dc86-d37e-4b02-87ac-6cd0616664c7-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng\" (UID: \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:27:58.455958 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.455926 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a82dc86-d37e-4b02-87ac-6cd0616664c7-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng\" (UID: \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:27:58.455958 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.455967 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqrw9\" (UniqueName: \"kubernetes.io/projected/0a82dc86-d37e-4b02-87ac-6cd0616664c7-kube-api-access-cqrw9\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng\" (UID: \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:27:58.456260 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.455996 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a82dc86-d37e-4b02-87ac-6cd0616664c7-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng\" (UID: \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:27:58.456260 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.456045 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0a82dc86-d37e-4b02-87ac-6cd0616664c7-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng\" (UID: \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:27:58.456260 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.456074 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a82dc86-d37e-4b02-87ac-6cd0616664c7-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng\" (UID: \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:27:58.456260 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:27:58.456173 2570 secret.go:189] Couldn't get secret 
kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert: secret "isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert" not found Apr 24 22:27:58.456260 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:27:58.456245 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a82dc86-d37e-4b02-87ac-6cd0616664c7-proxy-tls podName:0a82dc86-d37e-4b02-87ac-6cd0616664c7 nodeName:}" failed. No retries permitted until 2026-04-24 22:27:58.956224334 +0000 UTC m=+3631.379731741 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/0a82dc86-d37e-4b02-87ac-6cd0616664c7-proxy-tls") pod "isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" (UID: "0a82dc86-d37e-4b02-87ac-6cd0616664c7") : secret "isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert" not found Apr 24 22:27:58.456556 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.456424 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a82dc86-d37e-4b02-87ac-6cd0616664c7-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng\" (UID: \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:27:58.456721 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.456701 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a82dc86-d37e-4b02-87ac-6cd0616664c7-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng\" (UID: \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:27:58.456762 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.456709 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0a82dc86-d37e-4b02-87ac-6cd0616664c7-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng\" (UID: \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:27:58.466930 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.466908 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqrw9\" (UniqueName: \"kubernetes.io/projected/0a82dc86-d37e-4b02-87ac-6cd0616664c7-kube-api-access-cqrw9\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng\" (UID: \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:27:58.959181 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.959155 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a82dc86-d37e-4b02-87ac-6cd0616664c7-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng\" (UID: \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:27:58.961553 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:58.961531 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0a82dc86-d37e-4b02-87ac-6cd0616664c7-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng\" (UID: \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:27:59.203600 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:59.203568 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:27:59.324972 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:59.324943 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng"] Apr 24 22:27:59.328359 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:27:59.328327 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a82dc86_d37e_4b02_87ac_6cd0616664c7.slice/crio-2f893bcc6881b355dd18325c16d59caa1a37f98c666c2989db2bf5ac175eecc9 WatchSource:0}: Error finding container 2f893bcc6881b355dd18325c16d59caa1a37f98c666c2989db2bf5ac175eecc9: Status 404 returned error can't find the container with id 2f893bcc6881b355dd18325c16d59caa1a37f98c666c2989db2bf5ac175eecc9 Apr 24 22:27:59.694920 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:59.694883 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" event={"ID":"0a82dc86-d37e-4b02-87ac-6cd0616664c7","Type":"ContainerStarted","Data":"34e4bcd4675e63fc6401e38556e06458fe3314276c833abb0c2576025f972c2b"} Apr 24 22:27:59.695086 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:27:59.694926 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" event={"ID":"0a82dc86-d37e-4b02-87ac-6cd0616664c7","Type":"ContainerStarted","Data":"2f893bcc6881b355dd18325c16d59caa1a37f98c666c2989db2bf5ac175eecc9"} Apr 24 22:28:00.699287 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:28:00.699255 2570 generic.go:358] "Generic (PLEG): container finished" podID="0a82dc86-d37e-4b02-87ac-6cd0616664c7" containerID="34e4bcd4675e63fc6401e38556e06458fe3314276c833abb0c2576025f972c2b" exitCode=0 Apr 24 22:28:00.699750 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:28:00.699356 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" event={"ID":"0a82dc86-d37e-4b02-87ac-6cd0616664c7","Type":"ContainerDied","Data":"34e4bcd4675e63fc6401e38556e06458fe3314276c833abb0c2576025f972c2b"} Apr 24 22:28:01.703951 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:28:01.703919 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" event={"ID":"0a82dc86-d37e-4b02-87ac-6cd0616664c7","Type":"ContainerStarted","Data":"f0381fedb46ebc57a1cb2b2cd8ebddfd8939b6f342437ed7385b81b4b010ac0d"} Apr 24 22:28:01.703951 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:28:01.703954 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" event={"ID":"0a82dc86-d37e-4b02-87ac-6cd0616664c7","Type":"ContainerStarted","Data":"e119d2be443506164c425fc48b318e4b72246eab0dbc32cc54c84e37285b317b"} Apr 24 22:28:01.704397 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:28:01.704128 2570 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:28:01.704397 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:28:01.704268 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:28:01.705716 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:28:01.705689 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" podUID="0a82dc86-d37e-4b02-87ac-6cd0616664c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused" Apr 24 22:28:01.727536 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:28:01.727494 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" podStartSLOduration=3.727481157 podStartE2EDuration="3.727481157s" podCreationTimestamp="2026-04-24 22:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:28:01.726639135 +0000 UTC m=+3634.150146542" watchObservedRunningTime="2026-04-24 22:28:01.727481157 +0000 UTC m=+3634.150988561" Apr 24 22:28:02.707193 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:28:02.707150 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" podUID="0a82dc86-d37e-4b02-87ac-6cd0616664c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused" Apr 24 22:28:07.711079 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:28:07.711046 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:28:07.711562 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:28:07.711535 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" podUID="0a82dc86-d37e-4b02-87ac-6cd0616664c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused" Apr 24 22:28:17.711507 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:28:17.711464 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" podUID="0a82dc86-d37e-4b02-87ac-6cd0616664c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused" Apr 24 22:28:21.167319 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:28:21.167289 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 22:28:21.172487 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:28:21.172467 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 22:28:27.711946 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:28:27.711911 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" 
podUID="0a82dc86-d37e-4b02-87ac-6cd0616664c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused" Apr 24 22:28:37.712244 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:28:37.712208 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" podUID="0a82dc86-d37e-4b02-87ac-6cd0616664c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused" Apr 24 22:28:47.711894 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:28:47.711855 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" podUID="0a82dc86-d37e-4b02-87ac-6cd0616664c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused" Apr 24 22:28:57.711935 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:28:57.711892 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" podUID="0a82dc86-d37e-4b02-87ac-6cd0616664c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused" Apr 24 22:29:07.712852 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:07.712822 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:29:08.278411 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:08.278350 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng"] Apr 24 22:29:08.278805 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:08.278767 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" podUID="0a82dc86-d37e-4b02-87ac-6cd0616664c7" containerName="kserve-container" containerID="cri-o://e119d2be443506164c425fc48b318e4b72246eab0dbc32cc54c84e37285b317b" gracePeriod=30 Apr 24 22:29:08.278926 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:08.278798 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" podUID="0a82dc86-d37e-4b02-87ac-6cd0616664c7" containerName="kube-rbac-proxy" containerID="cri-o://f0381fedb46ebc57a1cb2b2cd8ebddfd8939b6f342437ed7385b81b4b010ac0d" gracePeriod=30 Apr 24 22:29:08.890242 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:08.890197 2570 generic.go:358] "Generic (PLEG): container finished" podID="0a82dc86-d37e-4b02-87ac-6cd0616664c7" containerID="f0381fedb46ebc57a1cb2b2cd8ebddfd8939b6f342437ed7385b81b4b010ac0d" exitCode=2 Apr 24 22:29:08.890616 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:08.890274 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" event={"ID":"0a82dc86-d37e-4b02-87ac-6cd0616664c7","Type":"ContainerDied","Data":"f0381fedb46ebc57a1cb2b2cd8ebddfd8939b6f342437ed7385b81b4b010ac0d"} Apr 24 22:29:09.376752 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:09.376713 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm"] Apr 24 22:29:09.380063 ip-10-0-129-230 
kubenswrapper[2570]: I0424 22:29:09.380043 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" Apr 24 22:29:09.382743 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:09.382719 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert\"" Apr 24 22:29:09.382885 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:09.382772 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\"" Apr 24 22:29:09.391288 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:09.391260 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm"] Apr 24 22:29:09.467814 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:09.467767 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7hgh\" (UniqueName: \"kubernetes.io/projected/42961337-6888-4587-8ae3-9769bbf1fba8-kube-api-access-c7hgh\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm\" (UID: \"42961337-6888-4587-8ae3-9769bbf1fba8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" Apr 24 22:29:09.468009 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:09.467833 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42961337-6888-4587-8ae3-9769bbf1fba8-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm\" (UID: \"42961337-6888-4587-8ae3-9769bbf1fba8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" Apr 24 22:29:09.468009 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:09.467893 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42961337-6888-4587-8ae3-9769bbf1fba8-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm\" (UID: \"42961337-6888-4587-8ae3-9769bbf1fba8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" Apr 24 22:29:09.468009 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:09.467964 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42961337-6888-4587-8ae3-9769bbf1fba8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm\" (UID: \"42961337-6888-4587-8ae3-9769bbf1fba8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" Apr 24 22:29:09.568998 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:09.568961 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42961337-6888-4587-8ae3-9769bbf1fba8-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm\" (UID: \"42961337-6888-4587-8ae3-9769bbf1fba8\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" Apr 24 22:29:09.569268 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:09.569013 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42961337-6888-4587-8ae3-9769bbf1fba8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm\" (UID: \"42961337-6888-4587-8ae3-9769bbf1fba8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" Apr 24 22:29:09.569268 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:09.569064 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7hgh\" (UniqueName: \"kubernetes.io/projected/42961337-6888-4587-8ae3-9769bbf1fba8-kube-api-access-c7hgh\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm\" (UID: \"42961337-6888-4587-8ae3-9769bbf1fba8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" Apr 24 22:29:09.569268 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:09.569098 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42961337-6888-4587-8ae3-9769bbf1fba8-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm\" (UID: \"42961337-6888-4587-8ae3-9769bbf1fba8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" Apr 24 22:29:09.569268 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:29:09.569204 2570 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert" not found Apr 24 22:29:09.569517 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:29:09.569280 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42961337-6888-4587-8ae3-9769bbf1fba8-proxy-tls podName:42961337-6888-4587-8ae3-9769bbf1fba8 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:10.069260172 +0000 UTC m=+3702.492767557 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/42961337-6888-4587-8ae3-9769bbf1fba8-proxy-tls") pod "isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" (UID: "42961337-6888-4587-8ae3-9769bbf1fba8") : secret "isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert" not found Apr 24 22:29:09.569517 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:09.569489 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42961337-6888-4587-8ae3-9769bbf1fba8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm\" (UID: \"42961337-6888-4587-8ae3-9769bbf1fba8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" Apr 24 22:29:09.569766 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:09.569744 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42961337-6888-4587-8ae3-9769bbf1fba8-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm\" (UID: \"42961337-6888-4587-8ae3-9769bbf1fba8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" Apr 24 22:29:09.580477 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:09.580440 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7hgh\" (UniqueName: \"kubernetes.io/projected/42961337-6888-4587-8ae3-9769bbf1fba8-kube-api-access-c7hgh\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm\" (UID: \"42961337-6888-4587-8ae3-9769bbf1fba8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" Apr 24 22:29:10.073655 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:10.073617 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42961337-6888-4587-8ae3-9769bbf1fba8-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm\" (UID: \"42961337-6888-4587-8ae3-9769bbf1fba8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" Apr 24 22:29:10.076277 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:10.076249 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42961337-6888-4587-8ae3-9769bbf1fba8-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm\" (UID: \"42961337-6888-4587-8ae3-9769bbf1fba8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" Apr 24 22:29:10.290877 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:10.290838 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" Apr 24 22:29:10.415169 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:10.415143 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm"] Apr 24 22:29:10.418273 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:29:10.418238 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42961337_6888_4587_8ae3_9769bbf1fba8.slice/crio-8f598df44d58827347c99964aba5ac5104fb2416dd0807304caced619a80acdc WatchSource:0}: Error finding container 8f598df44d58827347c99964aba5ac5104fb2416dd0807304caced619a80acdc: Status 404 returned error can't find the container with id 8f598df44d58827347c99964aba5ac5104fb2416dd0807304caced619a80acdc Apr 24 22:29:10.898425 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:10.898389 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" event={"ID":"42961337-6888-4587-8ae3-9769bbf1fba8","Type":"ContainerStarted","Data":"63753ac2546332ce38c14d657affe274dfd803151ca31a2edf27064fc9d9b3c9"} Apr 24 22:29:10.898425 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:10.898431 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" event={"ID":"42961337-6888-4587-8ae3-9769bbf1fba8","Type":"ContainerStarted","Data":"8f598df44d58827347c99964aba5ac5104fb2416dd0807304caced619a80acdc"} Apr 24 22:29:12.612146 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.612126 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:29:12.692637 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.692553 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a82dc86-d37e-4b02-87ac-6cd0616664c7-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\" (UID: \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\") " Apr 24 22:29:12.692637 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.692600 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqrw9\" (UniqueName: \"kubernetes.io/projected/0a82dc86-d37e-4b02-87ac-6cd0616664c7-kube-api-access-cqrw9\") pod \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\" (UID: \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\") " Apr 24 22:29:12.692839 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.692648 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a82dc86-d37e-4b02-87ac-6cd0616664c7-kserve-provision-location\") pod \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\" (UID: \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\") " Apr 24 22:29:12.692839 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.692667 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0a82dc86-d37e-4b02-87ac-6cd0616664c7-cabundle-cert\") pod \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\" (UID: \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\") " Apr 24 22:29:12.692839 ip-10-0-129-230 kubenswrapper[2570]: 
I0424 22:29:12.692691 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a82dc86-d37e-4b02-87ac-6cd0616664c7-proxy-tls\") pod \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\" (UID: \"0a82dc86-d37e-4b02-87ac-6cd0616664c7\") " Apr 24 22:29:12.693085 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.693064 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a82dc86-d37e-4b02-87ac-6cd0616664c7-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config") pod "0a82dc86-d37e-4b02-87ac-6cd0616664c7" (UID: "0a82dc86-d37e-4b02-87ac-6cd0616664c7"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:29:12.693167 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.693064 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a82dc86-d37e-4b02-87ac-6cd0616664c7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0a82dc86-d37e-4b02-87ac-6cd0616664c7" (UID: "0a82dc86-d37e-4b02-87ac-6cd0616664c7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:29:12.693167 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.693114 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a82dc86-d37e-4b02-87ac-6cd0616664c7-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "0a82dc86-d37e-4b02-87ac-6cd0616664c7" (UID: "0a82dc86-d37e-4b02-87ac-6cd0616664c7"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:29:12.694907 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.694888 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a82dc86-d37e-4b02-87ac-6cd0616664c7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0a82dc86-d37e-4b02-87ac-6cd0616664c7" (UID: "0a82dc86-d37e-4b02-87ac-6cd0616664c7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:29:12.694976 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.694944 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a82dc86-d37e-4b02-87ac-6cd0616664c7-kube-api-access-cqrw9" (OuterVolumeSpecName: "kube-api-access-cqrw9") pod "0a82dc86-d37e-4b02-87ac-6cd0616664c7" (UID: "0a82dc86-d37e-4b02-87ac-6cd0616664c7"). InnerVolumeSpecName "kube-api-access-cqrw9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:29:12.793639 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.793610 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a82dc86-d37e-4b02-87ac-6cd0616664c7-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:29:12.793639 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.793635 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cqrw9\" (UniqueName: \"kubernetes.io/projected/0a82dc86-d37e-4b02-87ac-6cd0616664c7-kube-api-access-cqrw9\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:29:12.793639 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.793645 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a82dc86-d37e-4b02-87ac-6cd0616664c7-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:29:12.793856 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.793656 2570 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0a82dc86-d37e-4b02-87ac-6cd0616664c7-cabundle-cert\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:29:12.793856 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.793665 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a82dc86-d37e-4b02-87ac-6cd0616664c7-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:29:12.905330 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.905292 2570 generic.go:358] "Generic (PLEG): container finished" podID="0a82dc86-d37e-4b02-87ac-6cd0616664c7" containerID="e119d2be443506164c425fc48b318e4b72246eab0dbc32cc54c84e37285b317b" exitCode=0 Apr 24 22:29:12.905485 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.905372 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" Apr 24 22:29:12.905485 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.905378 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" event={"ID":"0a82dc86-d37e-4b02-87ac-6cd0616664c7","Type":"ContainerDied","Data":"e119d2be443506164c425fc48b318e4b72246eab0dbc32cc54c84e37285b317b"} Apr 24 22:29:12.905485 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.905418 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng" event={"ID":"0a82dc86-d37e-4b02-87ac-6cd0616664c7","Type":"ContainerDied","Data":"2f893bcc6881b355dd18325c16d59caa1a37f98c666c2989db2bf5ac175eecc9"} Apr 24 22:29:12.905485 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.905435 2570 scope.go:117] "RemoveContainer" containerID="f0381fedb46ebc57a1cb2b2cd8ebddfd8939b6f342437ed7385b81b4b010ac0d" Apr 24 22:29:12.913616 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.913598 2570 scope.go:117] "RemoveContainer" containerID="e119d2be443506164c425fc48b318e4b72246eab0dbc32cc54c84e37285b317b" Apr 24 22:29:12.920547 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.920530 2570 scope.go:117] "RemoveContainer" containerID="34e4bcd4675e63fc6401e38556e06458fe3314276c833abb0c2576025f972c2b" Apr 24 22:29:12.925854 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.925834 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng"] Apr 24 22:29:12.928684 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.928662 2570 scope.go:117] "RemoveContainer" containerID="f0381fedb46ebc57a1cb2b2cd8ebddfd8939b6f342437ed7385b81b4b010ac0d" Apr 24 22:29:12.928977 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:29:12.928959 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0381fedb46ebc57a1cb2b2cd8ebddfd8939b6f342437ed7385b81b4b010ac0d\": container with ID starting with f0381fedb46ebc57a1cb2b2cd8ebddfd8939b6f342437ed7385b81b4b010ac0d not found: ID does not exist" containerID="f0381fedb46ebc57a1cb2b2cd8ebddfd8939b6f342437ed7385b81b4b010ac0d" Apr 24 22:29:12.929064 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.928985 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0381fedb46ebc57a1cb2b2cd8ebddfd8939b6f342437ed7385b81b4b010ac0d"} err="failed to get container status \"f0381fedb46ebc57a1cb2b2cd8ebddfd8939b6f342437ed7385b81b4b010ac0d\": rpc error: code = NotFound desc = could not find container \"f0381fedb46ebc57a1cb2b2cd8ebddfd8939b6f342437ed7385b81b4b010ac0d\": container with ID starting with f0381fedb46ebc57a1cb2b2cd8ebddfd8939b6f342437ed7385b81b4b010ac0d not found: ID does not exist" Apr 24 22:29:12.929064 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.929009 2570 scope.go:117] "RemoveContainer" containerID="e119d2be443506164c425fc48b318e4b72246eab0dbc32cc54c84e37285b317b" Apr 24 22:29:12.929286 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:29:12.929271 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e119d2be443506164c425fc48b318e4b72246eab0dbc32cc54c84e37285b317b\": container with ID starting with e119d2be443506164c425fc48b318e4b72246eab0dbc32cc54c84e37285b317b not found: ID does not 
exist" containerID="e119d2be443506164c425fc48b318e4b72246eab0dbc32cc54c84e37285b317b" Apr 24 22:29:12.929340 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.929293 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e119d2be443506164c425fc48b318e4b72246eab0dbc32cc54c84e37285b317b"} err="failed to get container status \"e119d2be443506164c425fc48b318e4b72246eab0dbc32cc54c84e37285b317b\": rpc error: code = NotFound desc = could not find container \"e119d2be443506164c425fc48b318e4b72246eab0dbc32cc54c84e37285b317b\": container with ID starting with e119d2be443506164c425fc48b318e4b72246eab0dbc32cc54c84e37285b317b not found: ID does not exist" Apr 24 22:29:12.929340 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.929308 2570 scope.go:117] "RemoveContainer" containerID="34e4bcd4675e63fc6401e38556e06458fe3314276c833abb0c2576025f972c2b" Apr 24 22:29:12.929556 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:29:12.929531 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34e4bcd4675e63fc6401e38556e06458fe3314276c833abb0c2576025f972c2b\": container with ID starting with 34e4bcd4675e63fc6401e38556e06458fe3314276c833abb0c2576025f972c2b not found: ID does not exist" containerID="34e4bcd4675e63fc6401e38556e06458fe3314276c833abb0c2576025f972c2b" Apr 24 22:29:12.929652 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.929562 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34e4bcd4675e63fc6401e38556e06458fe3314276c833abb0c2576025f972c2b"} err="failed to get container status \"34e4bcd4675e63fc6401e38556e06458fe3314276c833abb0c2576025f972c2b\": rpc error: code = NotFound desc = could not find container \"34e4bcd4675e63fc6401e38556e06458fe3314276c833abb0c2576025f972c2b\": container with ID starting with 34e4bcd4675e63fc6401e38556e06458fe3314276c833abb0c2576025f972c2b not found: ID does not exist" Apr 24 22:29:12.929738 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:12.929722 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75888cbd9c-bqbng"] Apr 24 22:29:14.067373 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:14.067338 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a82dc86-d37e-4b02-87ac-6cd0616664c7" path="/var/lib/kubelet/pods/0a82dc86-d37e-4b02-87ac-6cd0616664c7/volumes" Apr 24 22:29:16.919702 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:16.919671 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm_42961337-6888-4587-8ae3-9769bbf1fba8/storage-initializer/0.log" Apr 24 22:29:16.920179 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:16.919710 2570 generic.go:358] "Generic (PLEG): container finished" podID="42961337-6888-4587-8ae3-9769bbf1fba8" containerID="63753ac2546332ce38c14d657affe274dfd803151ca31a2edf27064fc9d9b3c9" exitCode=1 Apr 24 22:29:16.920179 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:16.919755 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" event={"ID":"42961337-6888-4587-8ae3-9769bbf1fba8","Type":"ContainerDied","Data":"63753ac2546332ce38c14d657affe274dfd803151ca31a2edf27064fc9d9b3c9"} Apr 24 22:29:17.927074 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:17.927018 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm_42961337-6888-4587-8ae3-9769bbf1fba8/storage-initializer/0.log" Apr 24 22:29:17.927573 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:17.927129 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" event={"ID":"42961337-6888-4587-8ae3-9769bbf1fba8","Type":"ContainerStarted","Data":"1c1d2ee68bf9473d84d59c7d06e3078c080c5257595c8a4bbe3d418de08a3b69"} Apr 24 22:29:19.361580 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:19.361548 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm"] Apr 24 22:29:19.362090 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:19.361788 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" podUID="42961337-6888-4587-8ae3-9769bbf1fba8" containerName="storage-initializer" containerID="cri-o://1c1d2ee68bf9473d84d59c7d06e3078c080c5257595c8a4bbe3d418de08a3b69" gracePeriod=30 Apr 24 22:29:20.425108 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.425073 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96"] Apr 24 22:29:20.425466 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.425364 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a82dc86-d37e-4b02-87ac-6cd0616664c7" containerName="kube-rbac-proxy" Apr 24 22:29:20.425466 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.425375 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a82dc86-d37e-4b02-87ac-6cd0616664c7" containerName="kube-rbac-proxy" Apr 24 22:29:20.425466 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.425400 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a82dc86-d37e-4b02-87ac-6cd0616664c7" containerName="storage-initializer" Apr 24 22:29:20.425466 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.425407 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a82dc86-d37e-4b02-87ac-6cd0616664c7" containerName="storage-initializer" Apr 24 22:29:20.425466 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.425412 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a82dc86-d37e-4b02-87ac-6cd0616664c7" containerName="kserve-container" Apr 24 22:29:20.425466 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.425418 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a82dc86-d37e-4b02-87ac-6cd0616664c7" containerName="kserve-container" Apr 24 22:29:20.425466 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.425461 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a82dc86-d37e-4b02-87ac-6cd0616664c7" containerName="kube-rbac-proxy" Apr 24 22:29:20.425705 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.425474 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a82dc86-d37e-4b02-87ac-6cd0616664c7" containerName="kserve-container" Apr 24 22:29:20.428551 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.428533 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" Apr 24 22:29:20.430629 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.430609 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 22:29:20.430629 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.430623 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert\"" Apr 24 22:29:20.430774 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.430660 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\"" Apr 24 22:29:20.438952 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.438931 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96"] Apr 24 22:29:20.447636 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.447610 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/706635aa-0eb1-45cf-b843-692b86fcab68-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96\" (UID: \"706635aa-0eb1-45cf-b843-692b86fcab68\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" Apr 24 22:29:20.447794 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.447661 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/706635aa-0eb1-45cf-b843-692b86fcab68-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96\" (UID: \"706635aa-0eb1-45cf-b843-692b86fcab68\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" Apr 24 22:29:20.447794 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.447726 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/706635aa-0eb1-45cf-b843-692b86fcab68-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96\" (UID: \"706635aa-0eb1-45cf-b843-692b86fcab68\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" Apr 24 22:29:20.447917 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.447791 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/706635aa-0eb1-45cf-b843-692b86fcab68-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96\" (UID: \"706635aa-0eb1-45cf-b843-692b86fcab68\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" Apr 24 22:29:20.447917 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.447835 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwk2f\" (UniqueName: \"kubernetes.io/projected/706635aa-0eb1-45cf-b843-692b86fcab68-kube-api-access-pwk2f\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96\" (UID: 
\"706635aa-0eb1-45cf-b843-692b86fcab68\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" Apr 24 22:29:20.549259 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.549206 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/706635aa-0eb1-45cf-b843-692b86fcab68-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96\" (UID: \"706635aa-0eb1-45cf-b843-692b86fcab68\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" Apr 24 22:29:20.549453 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.549293 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/706635aa-0eb1-45cf-b843-692b86fcab68-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96\" (UID: \"706635aa-0eb1-45cf-b843-692b86fcab68\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" Apr 24 22:29:20.549453 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.549322 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/706635aa-0eb1-45cf-b843-692b86fcab68-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96\" (UID: \"706635aa-0eb1-45cf-b843-692b86fcab68\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" Apr 24 22:29:20.549453 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.549345 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/706635aa-0eb1-45cf-b843-692b86fcab68-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96\" (UID: \"706635aa-0eb1-45cf-b843-692b86fcab68\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" Apr 24 22:29:20.549453 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.549380 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwk2f\" (UniqueName: \"kubernetes.io/projected/706635aa-0eb1-45cf-b843-692b86fcab68-kube-api-access-pwk2f\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96\" (UID: \"706635aa-0eb1-45cf-b843-692b86fcab68\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" Apr 24 22:29:20.549874 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.549841 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/706635aa-0eb1-45cf-b843-692b86fcab68-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96\" (UID: \"706635aa-0eb1-45cf-b843-692b86fcab68\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" Apr 24 22:29:20.550045 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.549996 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/706635aa-0eb1-45cf-b843-692b86fcab68-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96\" (UID: \"706635aa-0eb1-45cf-b843-692b86fcab68\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" Apr 24 22:29:20.550116 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.550044 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/706635aa-0eb1-45cf-b843-692b86fcab68-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96\" (UID: \"706635aa-0eb1-45cf-b843-692b86fcab68\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" Apr 24 22:29:20.552007 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.551984 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/706635aa-0eb1-45cf-b843-692b86fcab68-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96\" (UID: \"706635aa-0eb1-45cf-b843-692b86fcab68\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" Apr 24 22:29:20.557526 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.557501 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwk2f\" (UniqueName: \"kubernetes.io/projected/706635aa-0eb1-45cf-b843-692b86fcab68-kube-api-access-pwk2f\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96\" (UID: \"706635aa-0eb1-45cf-b843-692b86fcab68\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" Apr 24 22:29:20.739016 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.738923 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" Apr 24 22:29:20.857616 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.857584 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96"] Apr 24 22:29:20.860453 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:29:20.860407 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod706635aa_0eb1_45cf_b843_692b86fcab68.slice/crio-8c0fbcaf3f23b8532ebbbe32b96d1d5c3ce372196b3789d1f8c2ba4854ae3642 WatchSource:0}: Error finding container 8c0fbcaf3f23b8532ebbbe32b96d1d5c3ce372196b3789d1f8c2ba4854ae3642: Status 404 returned error can't find the container with id 8c0fbcaf3f23b8532ebbbe32b96d1d5c3ce372196b3789d1f8c2ba4854ae3642 Apr 24 22:29:20.937472 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.937441 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" event={"ID":"706635aa-0eb1-45cf-b843-692b86fcab68","Type":"ContainerStarted","Data":"bb23f9ffb25138fac4ccf8a0f90884fd6573ea52d6d29002eb450460c9c13a8c"} Apr 24 22:29:20.937577 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:20.937482 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" event={"ID":"706635aa-0eb1-45cf-b843-692b86fcab68","Type":"ContainerStarted","Data":"8c0fbcaf3f23b8532ebbbe32b96d1d5c3ce372196b3789d1f8c2ba4854ae3642"} Apr 24 22:29:21.597343 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.597322 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm_42961337-6888-4587-8ae3-9769bbf1fba8/storage-initializer/1.log" Apr 24 22:29:21.597733 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.597665 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm_42961337-6888-4587-8ae3-9769bbf1fba8/storage-initializer/0.log" Apr 24 22:29:21.597733 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.597725 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" Apr 24 22:29:21.658422 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.658383 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7hgh\" (UniqueName: \"kubernetes.io/projected/42961337-6888-4587-8ae3-9769bbf1fba8-kube-api-access-c7hgh\") pod \"42961337-6888-4587-8ae3-9769bbf1fba8\" (UID: \"42961337-6888-4587-8ae3-9769bbf1fba8\") " Apr 24 22:29:21.658600 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.658448 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42961337-6888-4587-8ae3-9769bbf1fba8-kserve-provision-location\") pod \"42961337-6888-4587-8ae3-9769bbf1fba8\" (UID: \"42961337-6888-4587-8ae3-9769bbf1fba8\") " Apr 24 22:29:21.658600 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.658478 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42961337-6888-4587-8ae3-9769bbf1fba8-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"42961337-6888-4587-8ae3-9769bbf1fba8\" (UID: \"42961337-6888-4587-8ae3-9769bbf1fba8\") " Apr 24 22:29:21.658600 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.658511 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42961337-6888-4587-8ae3-9769bbf1fba8-proxy-tls\") pod \"42961337-6888-4587-8ae3-9769bbf1fba8\" (UID: \"42961337-6888-4587-8ae3-9769bbf1fba8\") " Apr 24 22:29:21.658773 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.658695 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42961337-6888-4587-8ae3-9769bbf1fba8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "42961337-6888-4587-8ae3-9769bbf1fba8" (UID: "42961337-6888-4587-8ae3-9769bbf1fba8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:29:21.658875 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.658853 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42961337-6888-4587-8ae3-9769bbf1fba8-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config") pod "42961337-6888-4587-8ae3-9769bbf1fba8" (UID: "42961337-6888-4587-8ae3-9769bbf1fba8"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:29:21.660653 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.660620 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42961337-6888-4587-8ae3-9769bbf1fba8-kube-api-access-c7hgh" (OuterVolumeSpecName: "kube-api-access-c7hgh") pod "42961337-6888-4587-8ae3-9769bbf1fba8" (UID: "42961337-6888-4587-8ae3-9769bbf1fba8"). InnerVolumeSpecName "kube-api-access-c7hgh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:29:21.660759 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.660686 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42961337-6888-4587-8ae3-9769bbf1fba8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "42961337-6888-4587-8ae3-9769bbf1fba8" (UID: "42961337-6888-4587-8ae3-9769bbf1fba8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:29:21.760068 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.759956 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42961337-6888-4587-8ae3-9769bbf1fba8-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:29:21.760068 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.759985 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c7hgh\" (UniqueName: \"kubernetes.io/projected/42961337-6888-4587-8ae3-9769bbf1fba8-kube-api-access-c7hgh\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:29:21.760068 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.759995 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42961337-6888-4587-8ae3-9769bbf1fba8-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:29:21.760068 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.760004 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42961337-6888-4587-8ae3-9769bbf1fba8-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:29:21.941002 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.940975 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm_42961337-6888-4587-8ae3-9769bbf1fba8/storage-initializer/1.log" Apr 24 22:29:21.941358 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.941343 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm_42961337-6888-4587-8ae3-9769bbf1fba8/storage-initializer/0.log" Apr 24 22:29:21.941406 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.941381 2570 generic.go:358] "Generic (PLEG): container finished" podID="42961337-6888-4587-8ae3-9769bbf1fba8" containerID="1c1d2ee68bf9473d84d59c7d06e3078c080c5257595c8a4bbe3d418de08a3b69" exitCode=1 Apr 24 22:29:21.941445 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.941414 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" event={"ID":"42961337-6888-4587-8ae3-9769bbf1fba8","Type":"ContainerDied","Data":"1c1d2ee68bf9473d84d59c7d06e3078c080c5257595c8a4bbe3d418de08a3b69"} 
Apr 24 22:29:21.941490 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.941454 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" event={"ID":"42961337-6888-4587-8ae3-9769bbf1fba8","Type":"ContainerDied","Data":"8f598df44d58827347c99964aba5ac5104fb2416dd0807304caced619a80acdc"} Apr 24 22:29:21.941490 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.941480 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm" Apr 24 22:29:21.941572 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.941493 2570 scope.go:117] "RemoveContainer" containerID="1c1d2ee68bf9473d84d59c7d06e3078c080c5257595c8a4bbe3d418de08a3b69" Apr 24 22:29:21.942897 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.942873 2570 generic.go:358] "Generic (PLEG): container finished" podID="706635aa-0eb1-45cf-b843-692b86fcab68" containerID="bb23f9ffb25138fac4ccf8a0f90884fd6573ea52d6d29002eb450460c9c13a8c" exitCode=0 Apr 24 22:29:21.943047 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.942968 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" event={"ID":"706635aa-0eb1-45cf-b843-692b86fcab68","Type":"ContainerDied","Data":"bb23f9ffb25138fac4ccf8a0f90884fd6573ea52d6d29002eb450460c9c13a8c"} Apr 24 22:29:21.949521 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.949505 2570 scope.go:117] "RemoveContainer" containerID="63753ac2546332ce38c14d657affe274dfd803151ca31a2edf27064fc9d9b3c9" Apr 24 22:29:21.956846 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.956750 2570 scope.go:117] "RemoveContainer" containerID="1c1d2ee68bf9473d84d59c7d06e3078c080c5257595c8a4bbe3d418de08a3b69" Apr 24 22:29:21.957058 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:29:21.957039 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c1d2ee68bf9473d84d59c7d06e3078c080c5257595c8a4bbe3d418de08a3b69\": container with ID starting with 1c1d2ee68bf9473d84d59c7d06e3078c080c5257595c8a4bbe3d418de08a3b69 not found: ID does not exist" containerID="1c1d2ee68bf9473d84d59c7d06e3078c080c5257595c8a4bbe3d418de08a3b69" Apr 24 22:29:21.957145 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.957064 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c1d2ee68bf9473d84d59c7d06e3078c080c5257595c8a4bbe3d418de08a3b69"} err="failed to get container status \"1c1d2ee68bf9473d84d59c7d06e3078c080c5257595c8a4bbe3d418de08a3b69\": rpc error: code = NotFound desc = could not find container \"1c1d2ee68bf9473d84d59c7d06e3078c080c5257595c8a4bbe3d418de08a3b69\": container with ID starting with 1c1d2ee68bf9473d84d59c7d06e3078c080c5257595c8a4bbe3d418de08a3b69 not found: ID does not exist" Apr 24 22:29:21.957145 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.957079 2570 scope.go:117] "RemoveContainer" containerID="63753ac2546332ce38c14d657affe274dfd803151ca31a2edf27064fc9d9b3c9" Apr 24 22:29:21.957294 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:29:21.957277 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63753ac2546332ce38c14d657affe274dfd803151ca31a2edf27064fc9d9b3c9\": container with ID starting with 63753ac2546332ce38c14d657affe274dfd803151ca31a2edf27064fc9d9b3c9 not found: ID does not exist" 
containerID="63753ac2546332ce38c14d657affe274dfd803151ca31a2edf27064fc9d9b3c9" Apr 24 22:29:21.957356 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.957300 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63753ac2546332ce38c14d657affe274dfd803151ca31a2edf27064fc9d9b3c9"} err="failed to get container status \"63753ac2546332ce38c14d657affe274dfd803151ca31a2edf27064fc9d9b3c9\": rpc error: code = NotFound desc = could not find container \"63753ac2546332ce38c14d657affe274dfd803151ca31a2edf27064fc9d9b3c9\": container with ID starting with 63753ac2546332ce38c14d657affe274dfd803151ca31a2edf27064fc9d9b3c9 not found: ID does not exist" Apr 24 22:29:21.992206 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.992180 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm"] Apr 24 22:29:21.995315 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:21.995289 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6bbcff75f-qktjm"] Apr 24 22:29:22.066916 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:22.066887 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42961337-6888-4587-8ae3-9769bbf1fba8" path="/var/lib/kubelet/pods/42961337-6888-4587-8ae3-9769bbf1fba8/volumes" Apr 24 22:29:22.949138 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:22.949105 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" event={"ID":"706635aa-0eb1-45cf-b843-692b86fcab68","Type":"ContainerStarted","Data":"e02f8b9719eab6e8870bb0decae94bb620125120bf68233bbb9467bab3735b16"} Apr 24 22:29:22.949138 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:22.949136 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" event={"ID":"706635aa-0eb1-45cf-b843-692b86fcab68","Type":"ContainerStarted","Data":"c781f03d2c1f6e1bff41690c20c09f80ac97be1f7011b1e60f4dcff1ca5a2c9e"} Apr 24 22:29:22.949683 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:22.949371 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" Apr 24 22:29:22.968305 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:22.968263 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" podStartSLOduration=2.968249653 podStartE2EDuration="2.968249653s" podCreationTimestamp="2026-04-24 22:29:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:29:22.967092032 +0000 UTC m=+3715.390599439" watchObservedRunningTime="2026-04-24 22:29:22.968249653 +0000 UTC m=+3715.391757060" Apr 24 22:29:23.952003 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:23.951970 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" Apr 24 22:29:23.953186 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:23.953162 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" podUID="706635aa-0eb1-45cf-b843-692b86fcab68" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused" Apr 24 22:29:24.955182 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:24.955146 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" podUID="706635aa-0eb1-45cf-b843-692b86fcab68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused" Apr 24 22:29:29.959442 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:29.959412 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" Apr 24 22:29:29.960014 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:29.959989 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" podUID="706635aa-0eb1-45cf-b843-692b86fcab68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused" Apr 24 22:29:39.959975 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:39.959929 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" podUID="706635aa-0eb1-45cf-b843-692b86fcab68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused" Apr 24 22:29:49.960499 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:49.960462 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" podUID="706635aa-0eb1-45cf-b843-692b86fcab68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused" Apr 24 22:29:59.960816 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:29:59.960774 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" podUID="706635aa-0eb1-45cf-b843-692b86fcab68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused" Apr 24 22:30:09.960764 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:09.960721 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" podUID="706635aa-0eb1-45cf-b843-692b86fcab68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused" Apr 24 22:30:19.960126 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:19.960039 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" podUID="706635aa-0eb1-45cf-b843-692b86fcab68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused" Apr 24 22:30:29.960523 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:29.960490 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" Apr 24 22:30:30.441358 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:30.441329 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96"] Apr 24 
22:30:30.441651 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:30.441622 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" podUID="706635aa-0eb1-45cf-b843-692b86fcab68" containerName="kserve-container" containerID="cri-o://c781f03d2c1f6e1bff41690c20c09f80ac97be1f7011b1e60f4dcff1ca5a2c9e" gracePeriod=30 Apr 24 22:30:30.441803 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:30.441658 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" podUID="706635aa-0eb1-45cf-b843-692b86fcab68" containerName="kube-rbac-proxy" containerID="cri-o://e02f8b9719eab6e8870bb0decae94bb620125120bf68233bbb9467bab3735b16" gracePeriod=30 Apr 24 22:30:31.135873 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.135838 2570 generic.go:358] "Generic (PLEG): container finished" podID="706635aa-0eb1-45cf-b843-692b86fcab68" containerID="e02f8b9719eab6e8870bb0decae94bb620125120bf68233bbb9467bab3735b16" exitCode=2 Apr 24 22:30:31.135873 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.135876 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" event={"ID":"706635aa-0eb1-45cf-b843-692b86fcab68","Type":"ContainerDied","Data":"e02f8b9719eab6e8870bb0decae94bb620125120bf68233bbb9467bab3735b16"} Apr 24 22:30:31.519545 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.519504 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km"] Apr 24 22:30:31.519821 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.519796 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42961337-6888-4587-8ae3-9769bbf1fba8" containerName="storage-initializer" Apr 24 22:30:31.519821 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.519814 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="42961337-6888-4587-8ae3-9769bbf1fba8" containerName="storage-initializer" Apr 24 22:30:31.519971 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.519832 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42961337-6888-4587-8ae3-9769bbf1fba8" containerName="storage-initializer" Apr 24 22:30:31.519971 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.519838 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="42961337-6888-4587-8ae3-9769bbf1fba8" containerName="storage-initializer" Apr 24 22:30:31.519971 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.519904 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="42961337-6888-4587-8ae3-9769bbf1fba8" containerName="storage-initializer" Apr 24 22:30:31.520114 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.520000 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="42961337-6888-4587-8ae3-9769bbf1fba8" containerName="storage-initializer" Apr 24 22:30:31.522570 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.522551 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" Apr 24 22:30:31.524660 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.524635 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\"" Apr 24 22:30:31.524765 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.524700 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert\"" Apr 24 22:30:31.536894 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.536868 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km"] Apr 24 22:30:31.577059 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.577006 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvjwj\" (UniqueName: \"kubernetes.io/projected/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-kube-api-access-lvjwj\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km\" (UID: \"b9fb720e-cb8b-4687-a5dc-eb4c28b17658\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" Apr 24 22:30:31.577267 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.577083 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km\" (UID: \"b9fb720e-cb8b-4687-a5dc-eb4c28b17658\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" Apr 24 22:30:31.577267 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.577115 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km\" (UID: \"b9fb720e-cb8b-4687-a5dc-eb4c28b17658\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" Apr 24 22:30:31.577267 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.577147 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km\" (UID: \"b9fb720e-cb8b-4687-a5dc-eb4c28b17658\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" Apr 24 22:30:31.677903 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.677853 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lvjwj\" (UniqueName: \"kubernetes.io/projected/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-kube-api-access-lvjwj\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km\" (UID: \"b9fb720e-cb8b-4687-a5dc-eb4c28b17658\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" Apr 24 22:30:31.677903 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.677911 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km\" (UID: \"b9fb720e-cb8b-4687-a5dc-eb4c28b17658\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" Apr 24 22:30:31.678182 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.677942 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km\" (UID: \"b9fb720e-cb8b-4687-a5dc-eb4c28b17658\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" Apr 24 22:30:31.678182 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.677965 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km\" (UID: \"b9fb720e-cb8b-4687-a5dc-eb4c28b17658\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" Apr 24 22:30:31.678182 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:30:31.678114 2570 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found Apr 24 22:30:31.678300 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:30:31.678193 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-proxy-tls podName:b9fb720e-cb8b-4687-a5dc-eb4c28b17658 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:32.17817068 +0000 UTC m=+3784.601678080 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-proxy-tls") pod "isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" (UID: "b9fb720e-cb8b-4687-a5dc-eb4c28b17658") : secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found Apr 24 22:30:31.678478 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.678457 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km\" (UID: \"b9fb720e-cb8b-4687-a5dc-eb4c28b17658\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" Apr 24 22:30:31.678703 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.678683 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km\" (UID: \"b9fb720e-cb8b-4687-a5dc-eb4c28b17658\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" Apr 24 22:30:31.686756 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:31.686733 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvjwj\" (UniqueName: \"kubernetes.io/projected/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-kube-api-access-lvjwj\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km\" (UID: \"b9fb720e-cb8b-4687-a5dc-eb4c28b17658\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" Apr 24 22:30:32.181702 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:32.181661 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km\" (UID: \"b9fb720e-cb8b-4687-a5dc-eb4c28b17658\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" Apr 24 22:30:32.184275 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:32.184255 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km\" (UID: \"b9fb720e-cb8b-4687-a5dc-eb4c28b17658\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" Apr 24 22:30:32.432480 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:32.432390 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" Apr 24 22:30:32.556370 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:32.556338 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km"] Apr 24 22:30:32.559876 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:30:32.559851 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9fb720e_cb8b_4687_a5dc_eb4c28b17658.slice/crio-de3f65e879134ba67ea2bbdaa8e5f6256e96d3df9cceef78d91bb41846d0720b WatchSource:0}: Error finding container de3f65e879134ba67ea2bbdaa8e5f6256e96d3df9cceef78d91bb41846d0720b: Status 404 returned error can't find the container with id de3f65e879134ba67ea2bbdaa8e5f6256e96d3df9cceef78d91bb41846d0720b Apr 24 22:30:32.562148 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:32.562129 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:30:33.143727 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:33.143693 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" event={"ID":"b9fb720e-cb8b-4687-a5dc-eb4c28b17658","Type":"ContainerStarted","Data":"d8b08ff66143769f92c07b3c0b63845f5adcdfaa4c42b0ca48abdf9584efd1ef"} Apr 24 22:30:33.143727 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:33.143733 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" event={"ID":"b9fb720e-cb8b-4687-a5dc-eb4c28b17658","Type":"ContainerStarted","Data":"de3f65e879134ba67ea2bbdaa8e5f6256e96d3df9cceef78d91bb41846d0720b"} Apr 24 22:30:34.668125 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:34.668104 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" Apr 24 22:30:34.698303 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:34.698275 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/706635aa-0eb1-45cf-b843-692b86fcab68-proxy-tls\") pod \"706635aa-0eb1-45cf-b843-692b86fcab68\" (UID: \"706635aa-0eb1-45cf-b843-692b86fcab68\") " Apr 24 22:30:34.698404 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:34.698323 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwk2f\" (UniqueName: \"kubernetes.io/projected/706635aa-0eb1-45cf-b843-692b86fcab68-kube-api-access-pwk2f\") pod \"706635aa-0eb1-45cf-b843-692b86fcab68\" (UID: \"706635aa-0eb1-45cf-b843-692b86fcab68\") " Apr 24 22:30:34.698404 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:34.698385 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/706635aa-0eb1-45cf-b843-692b86fcab68-kserve-provision-location\") pod \"706635aa-0eb1-45cf-b843-692b86fcab68\" (UID: \"706635aa-0eb1-45cf-b843-692b86fcab68\") " Apr 24 22:30:34.698755 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:34.698720 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/706635aa-0eb1-45cf-b843-692b86fcab68-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "706635aa-0eb1-45cf-b843-692b86fcab68" (UID: "706635aa-0eb1-45cf-b843-692b86fcab68"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:30:34.701076 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:34.701051 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/706635aa-0eb1-45cf-b843-692b86fcab68-kube-api-access-pwk2f" (OuterVolumeSpecName: "kube-api-access-pwk2f") pod "706635aa-0eb1-45cf-b843-692b86fcab68" (UID: "706635aa-0eb1-45cf-b843-692b86fcab68"). InnerVolumeSpecName "kube-api-access-pwk2f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:30:34.701076 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:34.701052 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706635aa-0eb1-45cf-b843-692b86fcab68-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "706635aa-0eb1-45cf-b843-692b86fcab68" (UID: "706635aa-0eb1-45cf-b843-692b86fcab68"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:30:34.799665 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:34.799610 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/706635aa-0eb1-45cf-b843-692b86fcab68-cabundle-cert\") pod \"706635aa-0eb1-45cf-b843-692b86fcab68\" (UID: \"706635aa-0eb1-45cf-b843-692b86fcab68\") " Apr 24 22:30:34.799665 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:34.799637 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/706635aa-0eb1-45cf-b843-692b86fcab68-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"706635aa-0eb1-45cf-b843-692b86fcab68\" (UID: \"706635aa-0eb1-45cf-b843-692b86fcab68\") " Apr 24 22:30:34.799815 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:34.799727 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/706635aa-0eb1-45cf-b843-692b86fcab68-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:30:34.799815 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:34.799737 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pwk2f\" (UniqueName: \"kubernetes.io/projected/706635aa-0eb1-45cf-b843-692b86fcab68-kube-api-access-pwk2f\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:30:34.799815 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:34.799748 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/706635aa-0eb1-45cf-b843-692b86fcab68-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:30:34.799927 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:34.799909 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/706635aa-0eb1-45cf-b843-692b86fcab68-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "706635aa-0eb1-45cf-b843-692b86fcab68" (UID: "706635aa-0eb1-45cf-b843-692b86fcab68"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:30:34.800016 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:34.799996 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/706635aa-0eb1-45cf-b843-692b86fcab68-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config") pod "706635aa-0eb1-45cf-b843-692b86fcab68" (UID: "706635aa-0eb1-45cf-b843-692b86fcab68"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:30:34.900596 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:34.900574 2570 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/706635aa-0eb1-45cf-b843-692b86fcab68-cabundle-cert\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:30:34.900596 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:34.900597 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/706635aa-0eb1-45cf-b843-692b86fcab68-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:30:35.151225 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:35.151126 2570 generic.go:358] "Generic (PLEG): container finished" podID="706635aa-0eb1-45cf-b843-692b86fcab68" containerID="c781f03d2c1f6e1bff41690c20c09f80ac97be1f7011b1e60f4dcff1ca5a2c9e" exitCode=0 Apr 24 22:30:35.151225 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:35.151169 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" event={"ID":"706635aa-0eb1-45cf-b843-692b86fcab68","Type":"ContainerDied","Data":"c781f03d2c1f6e1bff41690c20c09f80ac97be1f7011b1e60f4dcff1ca5a2c9e"} Apr 24 22:30:35.151225 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:35.151197 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" event={"ID":"706635aa-0eb1-45cf-b843-692b86fcab68","Type":"ContainerDied","Data":"8c0fbcaf3f23b8532ebbbe32b96d1d5c3ce372196b3789d1f8c2ba4854ae3642"} Apr 24 22:30:35.151225 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:35.151204 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96" Apr 24 22:30:35.151225 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:35.151216 2570 scope.go:117] "RemoveContainer" containerID="e02f8b9719eab6e8870bb0decae94bb620125120bf68233bbb9467bab3735b16" Apr 24 22:30:35.159367 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:35.159352 2570 scope.go:117] "RemoveContainer" containerID="c781f03d2c1f6e1bff41690c20c09f80ac97be1f7011b1e60f4dcff1ca5a2c9e" Apr 24 22:30:35.166260 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:35.166243 2570 scope.go:117] "RemoveContainer" containerID="bb23f9ffb25138fac4ccf8a0f90884fd6573ea52d6d29002eb450460c9c13a8c" Apr 24 22:30:35.171218 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:35.171196 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96"] Apr 24 22:30:35.173914 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:35.173900 2570 scope.go:117] "RemoveContainer" containerID="e02f8b9719eab6e8870bb0decae94bb620125120bf68233bbb9467bab3735b16" Apr 24 22:30:35.174182 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:30:35.174165 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e02f8b9719eab6e8870bb0decae94bb620125120bf68233bbb9467bab3735b16\": container with ID starting with e02f8b9719eab6e8870bb0decae94bb620125120bf68233bbb9467bab3735b16 not found: ID does not exist" containerID="e02f8b9719eab6e8870bb0decae94bb620125120bf68233bbb9467bab3735b16" Apr 24 22:30:35.174236 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:35.174192 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e02f8b9719eab6e8870bb0decae94bb620125120bf68233bbb9467bab3735b16"} err="failed to get container status \"e02f8b9719eab6e8870bb0decae94bb620125120bf68233bbb9467bab3735b16\": rpc error: code = NotFound desc = could not find container \"e02f8b9719eab6e8870bb0decae94bb620125120bf68233bbb9467bab3735b16\": container with ID starting with e02f8b9719eab6e8870bb0decae94bb620125120bf68233bbb9467bab3735b16 not found: ID does not exist" Apr 24 22:30:35.174236 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:35.174215 2570 scope.go:117] "RemoveContainer" containerID="c781f03d2c1f6e1bff41690c20c09f80ac97be1f7011b1e60f4dcff1ca5a2c9e" Apr 24 22:30:35.174437 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:30:35.174414 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c781f03d2c1f6e1bff41690c20c09f80ac97be1f7011b1e60f4dcff1ca5a2c9e\": container with ID starting with c781f03d2c1f6e1bff41690c20c09f80ac97be1f7011b1e60f4dcff1ca5a2c9e not found: ID does not exist" containerID="c781f03d2c1f6e1bff41690c20c09f80ac97be1f7011b1e60f4dcff1ca5a2c9e" Apr 24 22:30:35.174567 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:35.174447 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c781f03d2c1f6e1bff41690c20c09f80ac97be1f7011b1e60f4dcff1ca5a2c9e"} err="failed to get container status \"c781f03d2c1f6e1bff41690c20c09f80ac97be1f7011b1e60f4dcff1ca5a2c9e\": rpc error: code = NotFound desc = could not find container \"c781f03d2c1f6e1bff41690c20c09f80ac97be1f7011b1e60f4dcff1ca5a2c9e\": container with ID starting with c781f03d2c1f6e1bff41690c20c09f80ac97be1f7011b1e60f4dcff1ca5a2c9e not found: ID does not exist" Apr 24 22:30:35.174567 ip-10-0-129-230 
kubenswrapper[2570]: I0424 22:30:35.174469 2570 scope.go:117] "RemoveContainer" containerID="bb23f9ffb25138fac4ccf8a0f90884fd6573ea52d6d29002eb450460c9c13a8c" Apr 24 22:30:35.174721 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:30:35.174705 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb23f9ffb25138fac4ccf8a0f90884fd6573ea52d6d29002eb450460c9c13a8c\": container with ID starting with bb23f9ffb25138fac4ccf8a0f90884fd6573ea52d6d29002eb450460c9c13a8c not found: ID does not exist" containerID="bb23f9ffb25138fac4ccf8a0f90884fd6573ea52d6d29002eb450460c9c13a8c" Apr 24 22:30:35.174764 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:35.174725 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb23f9ffb25138fac4ccf8a0f90884fd6573ea52d6d29002eb450460c9c13a8c"} err="failed to get container status \"bb23f9ffb25138fac4ccf8a0f90884fd6573ea52d6d29002eb450460c9c13a8c\": rpc error: code = NotFound desc = could not find container \"bb23f9ffb25138fac4ccf8a0f90884fd6573ea52d6d29002eb450460c9c13a8c\": container with ID starting with bb23f9ffb25138fac4ccf8a0f90884fd6573ea52d6d29002eb450460c9c13a8c not found: ID does not exist" Apr 24 22:30:35.176569 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:35.176549 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-77554c6fb4-69p96"] Apr 24 22:30:36.067691 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:36.067660 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="706635aa-0eb1-45cf-b843-692b86fcab68" path="/var/lib/kubelet/pods/706635aa-0eb1-45cf-b843-692b86fcab68/volumes" Apr 24 22:30:38.166645 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:38.166618 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km_b9fb720e-cb8b-4687-a5dc-eb4c28b17658/storage-initializer/0.log" Apr 24 22:30:38.167047 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:38.166656 2570 generic.go:358] "Generic (PLEG): container finished" podID="b9fb720e-cb8b-4687-a5dc-eb4c28b17658" containerID="d8b08ff66143769f92c07b3c0b63845f5adcdfaa4c42b0ca48abdf9584efd1ef" exitCode=1 Apr 24 22:30:38.167047 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:38.166700 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" event={"ID":"b9fb720e-cb8b-4687-a5dc-eb4c28b17658","Type":"ContainerDied","Data":"d8b08ff66143769f92c07b3c0b63845f5adcdfaa4c42b0ca48abdf9584efd1ef"} Apr 24 22:30:39.170708 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:39.170680 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km_b9fb720e-cb8b-4687-a5dc-eb4c28b17658/storage-initializer/0.log" Apr 24 22:30:39.171111 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:39.170760 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" event={"ID":"b9fb720e-cb8b-4687-a5dc-eb4c28b17658","Type":"ContainerStarted","Data":"fb7dc3b740df12ef5bc07c5dab94e778a9f1514e8c28005e6398762ffcab6128"} Apr 24 22:30:41.521786 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:41.521755 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km"] Apr 24 22:30:41.522274 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:41.522121 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" podUID="b9fb720e-cb8b-4687-a5dc-eb4c28b17658" containerName="storage-initializer" containerID="cri-o://fb7dc3b740df12ef5bc07c5dab94e778a9f1514e8c28005e6398762ffcab6128" gracePeriod=30 Apr 24 22:30:43.964701 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:43.964679 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km_b9fb720e-cb8b-4687-a5dc-eb4c28b17658/storage-initializer/1.log" Apr 24 22:30:43.965051 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:43.965013 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km_b9fb720e-cb8b-4687-a5dc-eb4c28b17658/storage-initializer/0.log" Apr 24 22:30:43.965107 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:43.965098 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" Apr 24 22:30:44.068919 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.068837 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-proxy-tls\") pod \"b9fb720e-cb8b-4687-a5dc-eb4c28b17658\" (UID: \"b9fb720e-cb8b-4687-a5dc-eb4c28b17658\") " Apr 24 22:30:44.068919 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.068895 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvjwj\" (UniqueName: \"kubernetes.io/projected/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-kube-api-access-lvjwj\") pod \"b9fb720e-cb8b-4687-a5dc-eb4c28b17658\" (UID: \"b9fb720e-cb8b-4687-a5dc-eb4c28b17658\") " Apr 24 22:30:44.069168 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.068928 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-kserve-provision-location\") pod \"b9fb720e-cb8b-4687-a5dc-eb4c28b17658\" (UID: \"b9fb720e-cb8b-4687-a5dc-eb4c28b17658\") " Apr 24 22:30:44.069168 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.068958 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"b9fb720e-cb8b-4687-a5dc-eb4c28b17658\" (UID: \"b9fb720e-cb8b-4687-a5dc-eb4c28b17658\") " Apr 24 22:30:44.069276 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.069208 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b9fb720e-cb8b-4687-a5dc-eb4c28b17658" (UID: "b9fb720e-cb8b-4687-a5dc-eb4c28b17658"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:30:44.069362 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.069336 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config") pod "b9fb720e-cb8b-4687-a5dc-eb4c28b17658" (UID: "b9fb720e-cb8b-4687-a5dc-eb4c28b17658"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:30:44.071104 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.071085 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-kube-api-access-lvjwj" (OuterVolumeSpecName: "kube-api-access-lvjwj") pod "b9fb720e-cb8b-4687-a5dc-eb4c28b17658" (UID: "b9fb720e-cb8b-4687-a5dc-eb4c28b17658"). InnerVolumeSpecName "kube-api-access-lvjwj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:30:44.071198 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.071121 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b9fb720e-cb8b-4687-a5dc-eb4c28b17658" (UID: "b9fb720e-cb8b-4687-a5dc-eb4c28b17658"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:30:44.170282 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.170248 2570 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-proxy-tls\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:30:44.170282 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.170275 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lvjwj\" (UniqueName: \"kubernetes.io/projected/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-kube-api-access-lvjwj\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:30:44.170282 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.170285 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-kserve-provision-location\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:30:44.170517 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.170295 2570 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b9fb720e-cb8b-4687-a5dc-eb4c28b17658-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-230.ec2.internal\" DevicePath \"\"" Apr 24 22:30:44.185396 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.185373 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km_b9fb720e-cb8b-4687-a5dc-eb4c28b17658/storage-initializer/1.log" Apr 24 22:30:44.185677 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.185666 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km_b9fb720e-cb8b-4687-a5dc-eb4c28b17658/storage-initializer/0.log" Apr 24 22:30:44.185738 ip-10-0-129-230 
kubenswrapper[2570]: I0424 22:30:44.185700 2570 generic.go:358] "Generic (PLEG): container finished" podID="b9fb720e-cb8b-4687-a5dc-eb4c28b17658" containerID="fb7dc3b740df12ef5bc07c5dab94e778a9f1514e8c28005e6398762ffcab6128" exitCode=1 Apr 24 22:30:44.185777 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.185733 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" event={"ID":"b9fb720e-cb8b-4687-a5dc-eb4c28b17658","Type":"ContainerDied","Data":"fb7dc3b740df12ef5bc07c5dab94e778a9f1514e8c28005e6398762ffcab6128"} Apr 24 22:30:44.185777 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.185760 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" event={"ID":"b9fb720e-cb8b-4687-a5dc-eb4c28b17658","Type":"ContainerDied","Data":"de3f65e879134ba67ea2bbdaa8e5f6256e96d3df9cceef78d91bb41846d0720b"} Apr 24 22:30:44.185845 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.185779 2570 scope.go:117] "RemoveContainer" containerID="fb7dc3b740df12ef5bc07c5dab94e778a9f1514e8c28005e6398762ffcab6128" Apr 24 22:30:44.185845 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.185790 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km" Apr 24 22:30:44.193822 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.193796 2570 scope.go:117] "RemoveContainer" containerID="d8b08ff66143769f92c07b3c0b63845f5adcdfaa4c42b0ca48abdf9584efd1ef" Apr 24 22:30:44.200693 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.200675 2570 scope.go:117] "RemoveContainer" containerID="fb7dc3b740df12ef5bc07c5dab94e778a9f1514e8c28005e6398762ffcab6128" Apr 24 22:30:44.200935 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:30:44.200917 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb7dc3b740df12ef5bc07c5dab94e778a9f1514e8c28005e6398762ffcab6128\": container with ID starting with fb7dc3b740df12ef5bc07c5dab94e778a9f1514e8c28005e6398762ffcab6128 not found: ID does not exist" containerID="fb7dc3b740df12ef5bc07c5dab94e778a9f1514e8c28005e6398762ffcab6128" Apr 24 22:30:44.200983 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.200944 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb7dc3b740df12ef5bc07c5dab94e778a9f1514e8c28005e6398762ffcab6128"} err="failed to get container status \"fb7dc3b740df12ef5bc07c5dab94e778a9f1514e8c28005e6398762ffcab6128\": rpc error: code = NotFound desc = could not find container \"fb7dc3b740df12ef5bc07c5dab94e778a9f1514e8c28005e6398762ffcab6128\": container with ID starting with fb7dc3b740df12ef5bc07c5dab94e778a9f1514e8c28005e6398762ffcab6128 not found: ID does not exist" Apr 24 22:30:44.200983 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.200962 2570 scope.go:117] "RemoveContainer" containerID="d8b08ff66143769f92c07b3c0b63845f5adcdfaa4c42b0ca48abdf9584efd1ef" Apr 24 22:30:44.201223 ip-10-0-129-230 kubenswrapper[2570]: E0424 22:30:44.201207 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8b08ff66143769f92c07b3c0b63845f5adcdfaa4c42b0ca48abdf9584efd1ef\": container with ID starting with d8b08ff66143769f92c07b3c0b63845f5adcdfaa4c42b0ca48abdf9584efd1ef not found: ID does not exist" 
containerID="d8b08ff66143769f92c07b3c0b63845f5adcdfaa4c42b0ca48abdf9584efd1ef" Apr 24 22:30:44.201270 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.201227 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8b08ff66143769f92c07b3c0b63845f5adcdfaa4c42b0ca48abdf9584efd1ef"} err="failed to get container status \"d8b08ff66143769f92c07b3c0b63845f5adcdfaa4c42b0ca48abdf9584efd1ef\": rpc error: code = NotFound desc = could not find container \"d8b08ff66143769f92c07b3c0b63845f5adcdfaa4c42b0ca48abdf9584efd1ef\": container with ID starting with d8b08ff66143769f92c07b3c0b63845f5adcdfaa4c42b0ca48abdf9584efd1ef not found: ID does not exist" Apr 24 22:30:44.224187 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.224164 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km"] Apr 24 22:30:44.227692 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:44.227672 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6bcff8d486-rj2km"] Apr 24 22:30:46.067270 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:30:46.067226 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9fb720e-cb8b-4687-a5dc-eb4c28b17658" path="/var/lib/kubelet/pods/b9fb720e-cb8b-4687-a5dc-eb4c28b17658/volumes" Apr 24 22:31:11.113683 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.113647 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-scnl7/must-gather-n7zqv"] Apr 24 22:31:11.114205 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.113969 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="706635aa-0eb1-45cf-b843-692b86fcab68" containerName="kube-rbac-proxy" Apr 24 22:31:11.114205 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.113982 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="706635aa-0eb1-45cf-b843-692b86fcab68" containerName="kube-rbac-proxy" Apr 24 22:31:11.114205 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.113998 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="706635aa-0eb1-45cf-b843-692b86fcab68" containerName="storage-initializer" Apr 24 22:31:11.114205 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.114004 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="706635aa-0eb1-45cf-b843-692b86fcab68" containerName="storage-initializer" Apr 24 22:31:11.114205 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.114011 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9fb720e-cb8b-4687-a5dc-eb4c28b17658" containerName="storage-initializer" Apr 24 22:31:11.114205 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.114017 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9fb720e-cb8b-4687-a5dc-eb4c28b17658" containerName="storage-initializer" Apr 24 22:31:11.114205 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.114038 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="706635aa-0eb1-45cf-b843-692b86fcab68" containerName="kserve-container" Apr 24 22:31:11.114205 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.114044 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="706635aa-0eb1-45cf-b843-692b86fcab68" containerName="kserve-container" Apr 24 22:31:11.114205 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.114054 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="b9fb720e-cb8b-4687-a5dc-eb4c28b17658" containerName="storage-initializer" Apr 24 22:31:11.114205 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.114061 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9fb720e-cb8b-4687-a5dc-eb4c28b17658" containerName="storage-initializer" Apr 24 22:31:11.114205 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.114107 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="706635aa-0eb1-45cf-b843-692b86fcab68" containerName="kserve-container" Apr 24 22:31:11.114205 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.114117 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="706635aa-0eb1-45cf-b843-692b86fcab68" containerName="kube-rbac-proxy" Apr 24 22:31:11.114205 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.114124 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9fb720e-cb8b-4687-a5dc-eb4c28b17658" containerName="storage-initializer" Apr 24 22:31:11.114205 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.114197 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9fb720e-cb8b-4687-a5dc-eb4c28b17658" containerName="storage-initializer" Apr 24 22:31:11.116940 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.116924 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-scnl7/must-gather-n7zqv" Apr 24 22:31:11.119519 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.119497 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-scnl7\"/\"kube-root-ca.crt\"" Apr 24 22:31:11.120227 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.120210 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-scnl7\"/\"default-dockercfg-vdl5x\"" Apr 24 22:31:11.120227 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.120213 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-scnl7\"/\"openshift-service-ca.crt\"" Apr 24 22:31:11.123766 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.123693 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-scnl7/must-gather-n7zqv"] Apr 24 22:31:11.169219 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.169187 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhjqr\" (UniqueName: \"kubernetes.io/projected/a9ed8b00-1fa1-4b4c-a89c-ec223b98d108-kube-api-access-nhjqr\") pod \"must-gather-n7zqv\" (UID: \"a9ed8b00-1fa1-4b4c-a89c-ec223b98d108\") " pod="openshift-must-gather-scnl7/must-gather-n7zqv" Apr 24 22:31:11.169342 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.169229 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a9ed8b00-1fa1-4b4c-a89c-ec223b98d108-must-gather-output\") pod \"must-gather-n7zqv\" (UID: \"a9ed8b00-1fa1-4b4c-a89c-ec223b98d108\") " pod="openshift-must-gather-scnl7/must-gather-n7zqv" Apr 24 22:31:11.270651 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.270626 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhjqr\" (UniqueName: \"kubernetes.io/projected/a9ed8b00-1fa1-4b4c-a89c-ec223b98d108-kube-api-access-nhjqr\") pod \"must-gather-n7zqv\" (UID: \"a9ed8b00-1fa1-4b4c-a89c-ec223b98d108\") " 
pod="openshift-must-gather-scnl7/must-gather-n7zqv" Apr 24 22:31:11.270775 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.270661 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a9ed8b00-1fa1-4b4c-a89c-ec223b98d108-must-gather-output\") pod \"must-gather-n7zqv\" (UID: \"a9ed8b00-1fa1-4b4c-a89c-ec223b98d108\") " pod="openshift-must-gather-scnl7/must-gather-n7zqv" Apr 24 22:31:11.270935 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.270921 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a9ed8b00-1fa1-4b4c-a89c-ec223b98d108-must-gather-output\") pod \"must-gather-n7zqv\" (UID: \"a9ed8b00-1fa1-4b4c-a89c-ec223b98d108\") " pod="openshift-must-gather-scnl7/must-gather-n7zqv" Apr 24 22:31:11.295474 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.295446 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhjqr\" (UniqueName: \"kubernetes.io/projected/a9ed8b00-1fa1-4b4c-a89c-ec223b98d108-kube-api-access-nhjqr\") pod \"must-gather-n7zqv\" (UID: \"a9ed8b00-1fa1-4b4c-a89c-ec223b98d108\") " pod="openshift-must-gather-scnl7/must-gather-n7zqv" Apr 24 22:31:11.426382 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.426362 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-scnl7/must-gather-n7zqv" Apr 24 22:31:11.547403 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:11.547368 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-scnl7/must-gather-n7zqv"] Apr 24 22:31:11.550240 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:31:11.550211 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9ed8b00_1fa1_4b4c_a89c_ec223b98d108.slice/crio-267183658bbde0378b4106fae2fb2ef86363bd730de4b53925be9c4ad7144f08 WatchSource:0}: Error finding container 267183658bbde0378b4106fae2fb2ef86363bd730de4b53925be9c4ad7144f08: Status 404 returned error can't find the container with id 267183658bbde0378b4106fae2fb2ef86363bd730de4b53925be9c4ad7144f08 Apr 24 22:31:12.264417 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:12.264385 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-scnl7/must-gather-n7zqv" event={"ID":"a9ed8b00-1fa1-4b4c-a89c-ec223b98d108","Type":"ContainerStarted","Data":"267183658bbde0378b4106fae2fb2ef86363bd730de4b53925be9c4ad7144f08"} Apr 24 22:31:13.270010 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:13.269966 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-scnl7/must-gather-n7zqv" event={"ID":"a9ed8b00-1fa1-4b4c-a89c-ec223b98d108","Type":"ContainerStarted","Data":"186c4a9ea5ac27f5c0c8291cfaa75921cc057d5074319a46d96f1e41977f8095"} Apr 24 22:31:13.270010 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:13.270035 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-scnl7/must-gather-n7zqv" event={"ID":"a9ed8b00-1fa1-4b4c-a89c-ec223b98d108","Type":"ContainerStarted","Data":"76f5383895d241db204f8a6f18440c1da943f3fdd11f13928ba4e58cc02fbbe6"} Apr 24 22:31:13.287324 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:13.287255 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-scnl7/must-gather-n7zqv" podStartSLOduration=1.392932307 podStartE2EDuration="2.287239597s" 
podCreationTimestamp="2026-04-24 22:31:11 +0000 UTC" firstStartedPulling="2026-04-24 22:31:11.551896304 +0000 UTC m=+3823.975403689" lastFinishedPulling="2026-04-24 22:31:12.446203591 +0000 UTC m=+3824.869710979" observedRunningTime="2026-04-24 22:31:13.285688951 +0000 UTC m=+3825.709196451" watchObservedRunningTime="2026-04-24 22:31:13.287239597 +0000 UTC m=+3825.710747004" Apr 24 22:31:13.955117 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:13.955085 2570 ???:1] "http: TLS handshake error from 10.0.136.160:54692: EOF" Apr 24 22:31:13.962963 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:13.962930 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-2sfj6_bb780578-f495-4afb-a817-59c177c8993a/global-pull-secret-syncer/0.log" Apr 24 22:31:14.141199 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:14.141155 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-f5p8z_1b7be731-9db3-42b5-831e-c40b988f32aa/konnectivity-agent/0.log" Apr 24 22:31:14.207450 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:14.207378 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-230.ec2.internal_7f83006f9b8c8c12eeba4e850f965db6/haproxy/0.log" Apr 24 22:31:18.106407 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:18.106371 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5wq4b_428c6e21-0255-4e74-bbeb-9e2fbbffbb1c/node-exporter/0.log" Apr 24 22:31:18.126791 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:18.126762 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5wq4b_428c6e21-0255-4e74-bbeb-9e2fbbffbb1c/kube-rbac-proxy/0.log" Apr 24 22:31:18.151937 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:18.151912 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5wq4b_428c6e21-0255-4e74-bbeb-9e2fbbffbb1c/init-textfile/0.log" Apr 24 22:31:19.985913 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:19.985886 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-xm65x_755a83ac-6e0b-4533-8c76-435876e1c64e/networking-console-plugin/0.log" Apr 24 22:31:21.156819 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:21.156778 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv"] Apr 24 22:31:21.161281 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:21.161246 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv" Apr 24 22:31:21.168075 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:21.168042 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv"] Apr 24 22:31:21.252855 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:21.252808 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f9f0cad8-a78f-4992-aaa0-4de8c44ef062-lib-modules\") pod \"perf-node-gather-daemonset-rhjzv\" (UID: \"f9f0cad8-a78f-4992-aaa0-4de8c44ef062\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv" Apr 24 22:31:21.253070 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:21.252871 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f9f0cad8-a78f-4992-aaa0-4de8c44ef062-proc\") pod \"perf-node-gather-daemonset-rhjzv\" (UID: \"f9f0cad8-a78f-4992-aaa0-4de8c44ef062\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv" Apr 24 22:31:21.253070 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:21.252905 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhjpt\" (UniqueName: \"kubernetes.io/projected/f9f0cad8-a78f-4992-aaa0-4de8c44ef062-kube-api-access-vhjpt\") pod \"perf-node-gather-daemonset-rhjzv\" (UID: \"f9f0cad8-a78f-4992-aaa0-4de8c44ef062\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv" Apr 24 22:31:21.253070 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:21.252931 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9f0cad8-a78f-4992-aaa0-4de8c44ef062-sys\") pod \"perf-node-gather-daemonset-rhjzv\" (UID: \"f9f0cad8-a78f-4992-aaa0-4de8c44ef062\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv" Apr 24 22:31:21.253070 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:21.253015 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f9f0cad8-a78f-4992-aaa0-4de8c44ef062-podres\") pod \"perf-node-gather-daemonset-rhjzv\" (UID: \"f9f0cad8-a78f-4992-aaa0-4de8c44ef062\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv" Apr 24 22:31:21.353848 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:21.353812 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f9f0cad8-a78f-4992-aaa0-4de8c44ef062-podres\") pod \"perf-node-gather-daemonset-rhjzv\" (UID: \"f9f0cad8-a78f-4992-aaa0-4de8c44ef062\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv" Apr 24 22:31:21.353848 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:21.353851 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f9f0cad8-a78f-4992-aaa0-4de8c44ef062-lib-modules\") pod \"perf-node-gather-daemonset-rhjzv\" (UID: \"f9f0cad8-a78f-4992-aaa0-4de8c44ef062\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv" Apr 24 22:31:21.354131 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:21.353896 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/f9f0cad8-a78f-4992-aaa0-4de8c44ef062-proc\") pod \"perf-node-gather-daemonset-rhjzv\" (UID: \"f9f0cad8-a78f-4992-aaa0-4de8c44ef062\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv" Apr 24 22:31:21.354131 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:21.353921 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhjpt\" (UniqueName: \"kubernetes.io/projected/f9f0cad8-a78f-4992-aaa0-4de8c44ef062-kube-api-access-vhjpt\") pod \"perf-node-gather-daemonset-rhjzv\" (UID: \"f9f0cad8-a78f-4992-aaa0-4de8c44ef062\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv" Apr 24 22:31:21.354131 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:21.353942 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9f0cad8-a78f-4992-aaa0-4de8c44ef062-sys\") pod \"perf-node-gather-daemonset-rhjzv\" (UID: \"f9f0cad8-a78f-4992-aaa0-4de8c44ef062\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv" Apr 24 22:31:21.354131 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:21.354013 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f9f0cad8-a78f-4992-aaa0-4de8c44ef062-proc\") pod \"perf-node-gather-daemonset-rhjzv\" (UID: \"f9f0cad8-a78f-4992-aaa0-4de8c44ef062\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv" Apr 24 22:31:21.354131 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:21.354071 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9f0cad8-a78f-4992-aaa0-4de8c44ef062-sys\") pod \"perf-node-gather-daemonset-rhjzv\" (UID: \"f9f0cad8-a78f-4992-aaa0-4de8c44ef062\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv" Apr 24 22:31:21.354131 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:21.354127 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f9f0cad8-a78f-4992-aaa0-4de8c44ef062-podres\") pod \"perf-node-gather-daemonset-rhjzv\" (UID: \"f9f0cad8-a78f-4992-aaa0-4de8c44ef062\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv" Apr 24 22:31:21.354358 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:21.354125 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f9f0cad8-a78f-4992-aaa0-4de8c44ef062-lib-modules\") pod \"perf-node-gather-daemonset-rhjzv\" (UID: \"f9f0cad8-a78f-4992-aaa0-4de8c44ef062\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv" Apr 24 22:31:21.361768 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:21.361745 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhjpt\" (UniqueName: \"kubernetes.io/projected/f9f0cad8-a78f-4992-aaa0-4de8c44ef062-kube-api-access-vhjpt\") pod \"perf-node-gather-daemonset-rhjzv\" (UID: \"f9f0cad8-a78f-4992-aaa0-4de8c44ef062\") " pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv" Apr 24 22:31:21.474980 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:21.474947 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv" Apr 24 22:31:21.617532 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:21.617490 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv"] Apr 24 22:31:21.622685 ip-10-0-129-230 kubenswrapper[2570]: W0424 22:31:21.622656 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf9f0cad8_a78f_4992_aaa0_4de8c44ef062.slice/crio-52f4459788b7395a7068c232a2bb387b83edc827ff5a42e0c53aba28f9928a30 WatchSource:0}: Error finding container 52f4459788b7395a7068c232a2bb387b83edc827ff5a42e0c53aba28f9928a30: Status 404 returned error can't find the container with id 52f4459788b7395a7068c232a2bb387b83edc827ff5a42e0c53aba28f9928a30 Apr 24 22:31:21.984486 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:21.984408 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9nj6p_0c465288-2f3f-4fc1-9192-76111e546363/dns/0.log" Apr 24 22:31:22.004616 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:22.004591 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9nj6p_0c465288-2f3f-4fc1-9192-76111e546363/kube-rbac-proxy/0.log" Apr 24 22:31:22.093839 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:22.093816 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6lvzd_e90ff02b-a473-4732-a712-a6377d84bf43/dns-node-resolver/0.log" Apr 24 22:31:22.305828 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:22.305747 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv" event={"ID":"f9f0cad8-a78f-4992-aaa0-4de8c44ef062","Type":"ContainerStarted","Data":"52374161b13c10667578bc23e41725b875d30722e4a09070dc2ebfa8b1cc0567"} Apr 24 22:31:22.305828 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:22.305792 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv" event={"ID":"f9f0cad8-a78f-4992-aaa0-4de8c44ef062","Type":"ContainerStarted","Data":"52f4459788b7395a7068c232a2bb387b83edc827ff5a42e0c53aba28f9928a30"} Apr 24 22:31:22.306324 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:22.305874 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv" Apr 24 22:31:22.324488 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:22.324429 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv" podStartSLOduration=1.324409513 podStartE2EDuration="1.324409513s" podCreationTimestamp="2026-04-24 22:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:31:22.323361122 +0000 UTC m=+3834.746868525" watchObservedRunningTime="2026-04-24 22:31:22.324409513 +0000 UTC m=+3834.747916921" Apr 24 22:31:22.564499 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:22.564425 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-dxhb2_2331d294-90e6-4527-bfaa-8f3913c788e1/node-ca/0.log" Apr 24 22:31:23.628096 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:23.628071 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-p4k7x_4878b13f-f2f3-43a4-a7ed-f6b5b7c0acb5/serve-healthcheck-canary/0.log" Apr 24 22:31:24.122697 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:24.122664 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lrbvd_7e457ae3-8c68-4737-9278-09ce86be5d5d/kube-rbac-proxy/0.log" Apr 24 22:31:24.146139 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:24.146114 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lrbvd_7e457ae3-8c68-4737-9278-09ce86be5d5d/exporter/0.log" Apr 24 22:31:24.166399 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:24.166375 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lrbvd_7e457ae3-8c68-4737-9278-09ce86be5d5d/extractor/0.log" Apr 24 22:31:26.288569 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:26.288544 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-r4gw9_09fda931-98d2-45f5-a0b5-8d8302f3e524/server/0.log" Apr 24 22:31:26.545624 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:26.545553 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-dtlvr_d3f40481-9db3-448a-97e9-675a97385be5/manager/0.log" Apr 24 22:31:26.612809 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:26.612784 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-gvthv_c98e1427-1554-4779-bd5e-b7aece1c6da5/s3-tls-init-serving/0.log" Apr 24 22:31:26.643295 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:26.643274 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-v9bn7_4f14a23f-acdc-436a-b4bc-b44e9ff88871/seaweedfs/0.log" Apr 24 22:31:26.666490 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:26.666467 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-5gk7n_c778c90e-bec7-4db6-83cd-81ff3c204a34/seaweedfs-tls-custom/0.log" Apr 24 22:31:26.695544 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:26.695520 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-gcpxd_3bafcdab-0482-4bf0-bf70-c504030e1650/seaweedfs-tls-serving/0.log" Apr 24 22:31:28.320148 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:28.320122 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-scnl7/perf-node-gather-daemonset-rhjzv" Apr 24 22:31:31.971330 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:31.971298 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-74945_a30efbd8-e476-46d3-a06f-675f751559d5/kube-multus-additional-cni-plugins/0.log" Apr 24 22:31:31.992119 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:31.992098 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-74945_a30efbd8-e476-46d3-a06f-675f751559d5/egress-router-binary-copy/0.log" Apr 24 22:31:32.012879 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:32.012856 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-74945_a30efbd8-e476-46d3-a06f-675f751559d5/cni-plugins/0.log" Apr 24 22:31:32.034148 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:32.034125 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-74945_a30efbd8-e476-46d3-a06f-675f751559d5/bond-cni-plugin/0.log" Apr 24 22:31:32.054727 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:32.054695 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-74945_a30efbd8-e476-46d3-a06f-675f751559d5/routeoverride-cni/0.log" Apr 24 22:31:32.080893 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:32.080861 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-74945_a30efbd8-e476-46d3-a06f-675f751559d5/whereabouts-cni-bincopy/0.log" Apr 24 22:31:32.103653 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:32.103622 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-74945_a30efbd8-e476-46d3-a06f-675f751559d5/whereabouts-cni/0.log" Apr 24 22:31:32.531950 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:32.531917 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gpzd8_6db0669e-da93-4120-888c-ab35559e48f8/kube-multus/0.log" Apr 24 22:31:32.582495 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:32.582472 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-892qf_87439f5a-542b-48ed-980f-a2183de13b6f/network-metrics-daemon/0.log" Apr 24 22:31:32.610083 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:32.610059 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-892qf_87439f5a-542b-48ed-980f-a2183de13b6f/kube-rbac-proxy/0.log" Apr 24 22:31:33.422561 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:33.422530 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-controller/0.log" Apr 24 22:31:33.440891 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:33.440844 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/0.log" Apr 24 22:31:33.461675 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:33.461648 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovn-acl-logging/1.log" Apr 24 22:31:33.481158 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:33.481121 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/kube-rbac-proxy-node/0.log" Apr 24 22:31:33.507091 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:33.507060 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/kube-rbac-proxy-ovn-metrics/0.log" Apr 24 22:31:33.527387 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:33.527365 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/northd/0.log" Apr 24 22:31:33.549244 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:33.549224 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/nbdb/0.log" Apr 24 22:31:33.570909 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:33.570890 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/sbdb/0.log" Apr 24 22:31:33.692515 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:33.692424 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6c6jh_7203859a-3465-4088-aaf1-c39a752936b3/ovnkube-controller/0.log" Apr 24 22:31:35.446711 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:35.446685 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-bjr9v_1a1d36cf-baab-4f24-a9d1-4dde21da6db3/network-check-target-container/0.log" Apr 24 22:31:36.356702 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:36.356674 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-vmvcc_f3e83c89-0e34-4b6d-aa0b-98737298d3d7/iptables-alerter/0.log" Apr 24 22:31:37.052722 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:37.052695 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-vrsl9_7b219b45-d422-43ea-8238-d0f73e8f85e3/tuned/0.log" Apr 24 22:31:40.358773 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:40.358704 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-xrdbh_f6fa6e75-2769-408c-a21d-d05ee9ab8ea3/csi-driver/0.log" Apr 24 22:31:40.379719 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:40.379696 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-xrdbh_f6fa6e75-2769-408c-a21d-d05ee9ab8ea3/csi-node-driver-registrar/0.log" Apr 24 22:31:40.400570 ip-10-0-129-230 kubenswrapper[2570]: I0424 22:31:40.400544 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-xrdbh_f6fa6e75-2769-408c-a21d-d05ee9ab8ea3/csi-liveness-probe/0.log"