Apr 24 23:53:59.597204 ip-10-0-129-109 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 23:54:00.058359 ip-10-0-129-109 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:54:00.058359 ip-10-0-129-109 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 23:54:00.058359 ip-10-0-129-109 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:54:00.058359 ip-10-0-129-109 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 23:54:00.058359 ip-10-0-129-109 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:54:00.059162 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.059080    2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 23:54:00.062147 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062131    2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:54:00.062147 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062147    2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:54:00.062211 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062151    2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:54:00.062211 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062154    2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:54:00.062211 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062157    2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:54:00.062211 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062160    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:54:00.062211 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062163    2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:54:00.062211 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062166    2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:54:00.062211 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062168    2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:54:00.062211 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062171    2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:54:00.062211 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062174    2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:54:00.062211 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062177    2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:54:00.062211 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062180    2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:54:00.062211 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062187    2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:54:00.062211 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062190    2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:54:00.062211 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062193    2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:54:00.062211 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062195    2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:54:00.062211 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062198    2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:54:00.062211 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062202    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:54:00.062211 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062204    2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:54:00.062211 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062207    2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:54:00.062211 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062209    2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:54:00.062672 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062212    2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:54:00.062672 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062215    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:54:00.062672 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062218    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:54:00.062672 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062221    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:54:00.062672 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062224    2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:54:00.062672 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062226    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:54:00.062672 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062229    2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:54:00.062672 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062231    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:54:00.062672 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062234    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:54:00.062672 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062237    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:54:00.062672 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062239    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:54:00.062672 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062242    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:54:00.062672 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062245    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:54:00.062672 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062247    2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:54:00.062672 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062250    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:54:00.062672 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062252    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:54:00.062672 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062255    2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:54:00.062672 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062259    2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:54:00.062672 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062262    2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:54:00.062672 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062265    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:54:00.063176 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062267    2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:54:00.063176 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062269    2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:54:00.063176 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062272    2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:54:00.063176 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062275    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:54:00.063176 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062277    2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:54:00.063176 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062279    2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:54:00.063176 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062282    2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:54:00.063176 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062284    2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:54:00.063176 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062286    2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:54:00.063176 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062289    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:54:00.063176 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062291    2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:54:00.063176 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062293    2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:54:00.063176 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062297    2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:54:00.063176 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062301    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:54:00.063176 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062304    2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:54:00.063176 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062307    2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:54:00.063176 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062310    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:54:00.063176 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062312    2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:54:00.063176 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062314    2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:54:00.063663 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062317    2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:54:00.063663 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062319    2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:54:00.063663 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062322    2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:54:00.063663 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062325    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:54:00.063663 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062327    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:54:00.063663 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062332    2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:54:00.063663 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062335    2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:54:00.063663 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062338    2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:54:00.063663 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062340    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:54:00.063663 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062342    2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:54:00.063663 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062346    2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:54:00.063663 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062350    2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:54:00.063663 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062353    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:54:00.063663 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062357    2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:54:00.063663 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062360    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:54:00.063663 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062363    2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:54:00.063663 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062365    2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:54:00.063663 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062368    2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:54:00.063663 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062370    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:54:00.064135 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062372    2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:54:00.064135 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062375    2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:54:00.064135 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062377    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:54:00.064135 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062380    2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:54:00.064135 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062382    2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:54:00.064135 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062385    2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:54:00.064135 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062765    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:54:00.064135 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062772    2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:54:00.064135 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062775    2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:54:00.064135 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062777    2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:54:00.064135 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062781    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:54:00.064135 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062785    2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:54:00.064135 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062789    2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:54:00.064135 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062791    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:54:00.064135 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062794    2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:54:00.064135 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062797    2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:54:00.064135 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062799    2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:54:00.064135 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062802    2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:54:00.064135 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062806    2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:54:00.064135 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062808    2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:54:00.064630 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062811    2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:54:00.064630 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062813    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:54:00.064630 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062816    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:54:00.064630 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062818    2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:54:00.064630 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062820    2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:54:00.064630 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062823    2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:54:00.064630 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062825    2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:54:00.064630 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062828    2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:54:00.064630 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062830    2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:54:00.064630 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062833    2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:54:00.064630 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062835    2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:54:00.064630 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062838    2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:54:00.064630 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062840    2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:54:00.064630 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062842    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:54:00.064630 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062845    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:54:00.064630 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062847    2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:54:00.064630 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062849    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:54:00.064630 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062852    2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:54:00.064630 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062855    2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:54:00.065128 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062858    2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:54:00.065128 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062860    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:54:00.065128 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062863    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:54:00.065128 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062865    2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:54:00.065128 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062868    2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:54:00.065128 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062870    2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:54:00.065128 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062874    2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:54:00.065128 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062877    2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:54:00.065128 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062880    2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:54:00.065128 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062883    2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:54:00.065128 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062886    2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:54:00.065128 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062888    2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:54:00.065128 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062891    2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:54:00.065128 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062893    2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:54:00.065128 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062896    2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:54:00.065128 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062898    2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:54:00.065128 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062901    2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:54:00.065128 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062903    2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:54:00.065128 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062906    2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:54:00.065128 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062908    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:54:00.065128 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062910    2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:54:00.065666 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062931    2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:54:00.065666 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062935    2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:54:00.065666 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062939    2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:54:00.065666 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062943    2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:54:00.065666 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062947    2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:54:00.065666 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062950    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:54:00.065666 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062952    2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:54:00.065666 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062955    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:54:00.065666 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062959    2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:54:00.065666 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062961    2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:54:00.065666 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062964    2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:54:00.065666 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062967    2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:54:00.065666 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062970    2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:54:00.065666 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062972    2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:54:00.065666 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062975    2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:54:00.065666 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062977    2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:54:00.065666 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062980    2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:54:00.065666 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062982    2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:54:00.065666 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062985    2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:54:00.065666 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062987    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:54:00.066179 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062989    2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:54:00.066179 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062992    2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:54:00.066179 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062994    2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:54:00.066179 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.062997    2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:54:00.066179 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063000    2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:54:00.066179 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063003    2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:54:00.066179 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063005    2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:54:00.066179 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063007    2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:54:00.066179 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063010    2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:54:00.066179 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063012    2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:54:00.066179 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063014    2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:54:00.066179 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063017    2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:54:00.066179 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063082    2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 23:54:00.066179 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063089    2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 23:54:00.066179 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063095    2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 23:54:00.066179 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063100    2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 23:54:00.066179 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063104    2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 23:54:00.066179 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063107    2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 23:54:00.066179 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063112    2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 23:54:00.066179 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063116    2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 23:54:00.066179 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063119    2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063122    2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063126    2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063129    2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063132    2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063136    2576 flags.go:64] FLAG: --cgroup-root=""
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063139    2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063141    2576 flags.go:64] FLAG: --client-ca-file=""
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063144    2576 flags.go:64] FLAG: --cloud-config=""
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063147    2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063150    2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063153    2576 flags.go:64] FLAG: --cluster-domain=""
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063156    2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063159    2576 flags.go:64] FLAG: --config-dir=""
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063161    2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063164    2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063168    2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063171    2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063174    2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063177    2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063180    2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063183    2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063186    2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063190    2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063192    2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 23:54:00.066762 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063198    2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063200    2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063203    2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063206    2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063209    2576 flags.go:64] FLAG: --enable-server="true"
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063211    2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063216    2576 flags.go:64] FLAG: --event-burst="100"
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063219    2576 flags.go:64] FLAG: --event-qps="50"
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063222    2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063225    2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063228    2576 flags.go:64] FLAG: --eviction-hard=""
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063232    2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063234    2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063237    2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063240    2576 flags.go:64] FLAG: --eviction-soft=""
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063243    2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063246    2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063248    2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063251    2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063254    2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063257    2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063259    2576 flags.go:64] FLAG: --feature-gates=""
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063263    2576 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063266    2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063269    2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 24 23:54:00.067383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063272    2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 24 23:54:00.067992 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063275    2576 flags.go:64] FLAG: --healthz-port="10248"
Apr 24 23:54:00.067992 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063278    2576 flags.go:64] FLAG: --help="false"
Apr 24 23:54:00.067992 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063281    2576 flags.go:64] FLAG: --hostname-override="ip-10-0-129-109.ec2.internal"
Apr 24 23:54:00.067992 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063283    2576 flags.go:64] FLAG:
--housekeeping-interval="10s" Apr 24 23:54:00.067992 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063286 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 23:54:00.067992 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063289 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 23:54:00.067992 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063292 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 23:54:00.067992 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063296 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 23:54:00.067992 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063298 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 23:54:00.067992 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063301 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 23:54:00.067992 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063304 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 23:54:00.067992 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063307 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 23:54:00.067992 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063310 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 23:54:00.067992 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063313 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 23:54:00.067992 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063316 2576 flags.go:64] FLAG: --kube-reserved="" Apr 24 23:54:00.067992 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063318 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 23:54:00.067992 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063321 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 23:54:00.067992 ip-10-0-129-109 
kubenswrapper[2576]: I0424 23:54:00.063324 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 23:54:00.067992 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063327 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 23:54:00.067992 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063329 2576 flags.go:64] FLAG: --lock-file="" Apr 24 23:54:00.067992 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063332 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 23:54:00.067992 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063335 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 23:54:00.067992 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063337 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 23:54:00.067992 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063343 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 23:54:00.068569 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063345 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 23:54:00.068569 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063348 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 23:54:00.068569 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063351 2576 flags.go:64] FLAG: --logging-format="text" Apr 24 23:54:00.068569 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063353 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 23:54:00.068569 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063356 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 23:54:00.068569 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063359 2576 flags.go:64] FLAG: --manifest-url="" Apr 24 23:54:00.068569 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063362 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 24 23:54:00.068569 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063366 2576 flags.go:64] FLAG: 
--max-housekeeping-interval="15s" Apr 24 23:54:00.068569 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063369 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 23:54:00.068569 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063374 2576 flags.go:64] FLAG: --max-pods="110" Apr 24 23:54:00.068569 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063377 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 23:54:00.068569 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063380 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 23:54:00.068569 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063382 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 23:54:00.068569 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063385 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 23:54:00.068569 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063388 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 23:54:00.068569 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063392 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 23:54:00.068569 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063394 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 23:54:00.068569 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063401 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 23:54:00.068569 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063405 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 23:54:00.068569 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063408 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 23:54:00.068569 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063410 2576 flags.go:64] FLAG: --pod-cidr="" Apr 24 23:54:00.068569 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063413 2576 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 23:54:00.068569 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063418 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063421 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063424 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063427 2576 flags.go:64] FLAG: --port="10250" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063430 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063433 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-069572017c0ea9d45" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063436 2576 flags.go:64] FLAG: --qos-reserved="" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063439 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063442 2576 flags.go:64] FLAG: --register-node="true" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063444 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063447 2576 flags.go:64] FLAG: --register-with-taints="" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063451 2576 flags.go:64] FLAG: --registry-burst="10" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063454 2576 flags.go:64] FLAG: --registry-qps="5" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063456 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 24 
23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063459 2576 flags.go:64] FLAG: --reserved-memory="" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063462 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063465 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063468 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063470 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063473 2576 flags.go:64] FLAG: --runonce="false" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063476 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063480 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063483 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063485 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063488 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063491 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 23:54:00.069146 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063494 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063501 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: I0424 
23:54:00.063504 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063507 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063509 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063512 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063515 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063518 2576 flags.go:64] FLAG: --system-cgroups="" Apr 24 23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063520 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063526 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063528 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 24 23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063531 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063534 2576 flags.go:64] FLAG: --tls-min-version="" Apr 24 23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063537 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063539 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063542 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063545 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 
23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063548 2576 flags.go:64] FLAG: --v="2" Apr 24 23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063552 2576 flags.go:64] FLAG: --version="false" Apr 24 23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063556 2576 flags.go:64] FLAG: --vmodule="" Apr 24 23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063559 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.063562 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063660 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063665 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 23:54:00.069819 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063669 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 23:54:00.070424 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063672 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 23:54:00.070424 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063679 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 23:54:00.070424 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063682 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 23:54:00.070424 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063685 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 23:54:00.070424 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063687 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 23:54:00.070424 
ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063692 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 23:54:00.070424 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063695 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 23:54:00.070424 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063698 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 23:54:00.070424 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063702 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 23:54:00.070424 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063704 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 23:54:00.070424 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063707 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 23:54:00.070424 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063709 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 23:54:00.070424 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063712 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 23:54:00.070424 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063715 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 23:54:00.070424 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063717 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 23:54:00.070424 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063720 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 23:54:00.070424 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063722 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 23:54:00.070424 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063726 2576 feature_gate.go:349] Setting 
deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 23:54:00.070424 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063730 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 23:54:00.070895 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063732 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 23:54:00.070895 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063735 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 23:54:00.070895 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063737 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 23:54:00.070895 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063740 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 23:54:00.070895 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063742 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 23:54:00.070895 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063745 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 23:54:00.070895 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063747 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 23:54:00.070895 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063750 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 23:54:00.070895 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063752 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 23:54:00.070895 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063754 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 23:54:00.070895 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063757 2576 feature_gate.go:328] unrecognized feature gate: 
MultiDiskSetup Apr 24 23:54:00.070895 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063759 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 23:54:00.070895 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063761 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 23:54:00.070895 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063764 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 23:54:00.070895 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063766 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 23:54:00.070895 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063769 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 24 23:54:00.070895 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063772 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 23:54:00.070895 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063774 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 23:54:00.070895 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063778 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 23:54:00.070895 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063780 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 23:54:00.071410 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063782 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 23:54:00.071410 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063786 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 23:54:00.071410 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063788 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 23:54:00.071410 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063791 2576 
feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 23:54:00.071410 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063793 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 23:54:00.071410 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063796 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 23:54:00.071410 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063798 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 23:54:00.071410 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063806 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 23:54:00.071410 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063808 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 23:54:00.071410 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063811 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 23:54:00.071410 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063813 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 23:54:00.071410 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063815 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 23:54:00.071410 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063818 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 23:54:00.071410 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063820 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 23:54:00.071410 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063823 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 23:54:00.071410 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063825 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 23:54:00.071410 
ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063827 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 23:54:00.071410 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063830 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 23:54:00.071410 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063832 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 23:54:00.071410 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063835 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 23:54:00.071954 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063837 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 23:54:00.071954 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063839 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 23:54:00.071954 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063842 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 23:54:00.071954 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063844 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 23:54:00.071954 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063847 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 23:54:00.071954 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063849 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 23:54:00.071954 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063852 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 23:54:00.071954 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063855 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 23:54:00.071954 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063863 2576 feature_gate.go:328] unrecognized feature gate: 
MachineConfigNodes Apr 24 23:54:00.071954 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063865 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 23:54:00.071954 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063869 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 23:54:00.071954 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063871 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 23:54:00.071954 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063873 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 23:54:00.071954 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063877 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 23:54:00.071954 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063879 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 23:54:00.071954 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063882 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 23:54:00.071954 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063884 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 23:54:00.071954 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063887 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 23:54:00.071954 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063889 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 23:54:00.071954 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063892 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 23:54:00.072449 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063894 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 23:54:00.072449 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063897 
2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:54:00.072449 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063899 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:54:00.072449 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.063901 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:54:00.072449 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.064564 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 23:54:00.072449 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.071500 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 23:54:00.072449 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.071516 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 23:54:00.072449 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071564 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:54:00.072449 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071569 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:54:00.072449 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071572 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:54:00.072449 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071575 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:54:00.072449 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071577 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:54:00.072449 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071580 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:54:00.072449 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071583 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:54:00.072449 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071586 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:54:00.072449 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071588 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:54:00.072844 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071591 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:54:00.072844 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071593 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:54:00.072844 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071596 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:54:00.072844 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071598 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:54:00.072844 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071601 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:54:00.072844 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071603 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:54:00.072844 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071606 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:54:00.072844 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071609 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:54:00.072844 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071611 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:54:00.072844 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071614 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:54:00.072844 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071616 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:54:00.072844 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071619 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:54:00.072844 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071621 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:54:00.072844 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071623 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:54:00.072844 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071626 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:54:00.072844 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071628 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:54:00.072844 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071631 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:54:00.072844 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071633 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:54:00.072844 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071636 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:54:00.072844 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071638 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:54:00.073384 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071641 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:54:00.073384 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071644 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:54:00.073384 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071647 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:54:00.073384 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071650 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:54:00.073384 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071653 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:54:00.073384 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071656 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:54:00.073384 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071658 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:54:00.073384 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071660 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:54:00.073384 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071663 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:54:00.073384 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071665 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:54:00.073384 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071668 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:54:00.073384 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071672 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:54:00.073384 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071675 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:54:00.073384 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071680 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:54:00.073384 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071683 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:54:00.073384 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071686 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:54:00.073384 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071689 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:54:00.073384 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071692 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:54:00.073384 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071695 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:54:00.073835 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071697 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:54:00.073835 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071700 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:54:00.073835 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071704 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:54:00.073835 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071707 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:54:00.073835 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071709 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:54:00.073835 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071712 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:54:00.073835 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071714 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:54:00.073835 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071717 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:54:00.073835 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071719 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:54:00.073835 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071721 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:54:00.073835 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071724 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:54:00.073835 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071727 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:54:00.073835 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071730 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:54:00.073835 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071732 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:54:00.073835 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071735 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:54:00.073835 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071738 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:54:00.073835 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071747 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:54:00.073835 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071750 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:54:00.073835 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071753 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:54:00.074325 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071755 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:54:00.074325 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071758 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:54:00.074325 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071760 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:54:00.074325 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071763 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:54:00.074325 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071765 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:54:00.074325 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071768 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:54:00.074325 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071770 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:54:00.074325 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071772 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:54:00.074325 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071775 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:54:00.074325 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071777 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:54:00.074325 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071779 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:54:00.074325 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071782 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:54:00.074325 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071784 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:54:00.074325 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071786 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:54:00.074325 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071789 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:54:00.074325 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071791 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:54:00.074325 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071794 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:54:00.074325 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071797 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:54:00.074325 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071800 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:54:00.074822 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.071805 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 23:54:00.074822 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071961 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:54:00.074822 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071966 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:54:00.074822 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071970 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:54:00.074822 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071973 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:54:00.074822 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071975 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:54:00.074822 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071978 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:54:00.074822 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071982 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:54:00.074822 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071984 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:54:00.074822 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071987 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:54:00.074822 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071989 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:54:00.074822 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.071997 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:54:00.074822 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072000 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:54:00.074822 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072002 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:54:00.074822 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072004 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:54:00.074822 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072007 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:54:00.075242 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072009 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:54:00.075242 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072012 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:54:00.075242 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072014 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:54:00.075242 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072017 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:54:00.075242 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072019 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:54:00.075242 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072022 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:54:00.075242 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072024 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:54:00.075242 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072026 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:54:00.075242 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072028 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:54:00.075242 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072031 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:54:00.075242 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072034 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:54:00.075242 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072036 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:54:00.075242 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072038 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:54:00.075242 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072041 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:54:00.075242 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072046 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:54:00.075242 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072049 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:54:00.075242 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072052 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:54:00.075242 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072055 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:54:00.075242 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072058 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:54:00.075706 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072060 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:54:00.075706 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072063 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:54:00.075706 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072065 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:54:00.075706 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072068 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:54:00.075706 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072070 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:54:00.075706 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072073 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:54:00.075706 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072075 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:54:00.075706 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072077 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:54:00.075706 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072080 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:54:00.075706 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072094 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:54:00.075706 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072096 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:54:00.075706 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072099 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:54:00.075706 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072101 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:54:00.075706 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072103 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:54:00.075706 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072106 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:54:00.075706 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072108 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:54:00.075706 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072110 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:54:00.075706 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072113 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:54:00.075706 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072115 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:54:00.075706 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072118 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:54:00.076230 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072120 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:54:00.076230 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072122 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:54:00.076230 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072125 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:54:00.076230 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072127 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:54:00.076230 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072129 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:54:00.076230 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072132 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:54:00.076230 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072135 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:54:00.076230 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072137 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:54:00.076230 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072140 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:54:00.076230 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072142 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:54:00.076230 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072145 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:54:00.076230 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072147 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:54:00.076230 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072150 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:54:00.076230 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072152 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:54:00.076230 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072154 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:54:00.076230 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072157 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:54:00.076230 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072159 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:54:00.076230 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072161 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:54:00.076230 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072164 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:54:00.076230 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072166 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:54:00.076718 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072168 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:54:00.076718 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072171 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:54:00.076718 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072178 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:54:00.076718 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072181 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:54:00.076718 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072183 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:54:00.076718 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072186 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:54:00.076718 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072188 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:54:00.076718 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072190 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:54:00.076718 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072193 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:54:00.076718 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072196 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:54:00.076718 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072199 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:54:00.076718 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:00.072202 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:54:00.076718 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.072207 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 23:54:00.076718 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.073099 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 23:54:00.076718 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.075573 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 23:54:00.077107 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.076406 2576 server.go:1019] "Starting client certificate rotation"
Apr 24 23:54:00.077107 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.076521 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 23:54:00.077107 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.076558 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 23:54:00.106358 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.106337 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 23:54:00.110652 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.110628 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 23:54:00.123178 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.123158 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 24 23:54:00.129036 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.129022 2576 log.go:25] "Validated CRI v1 image API"
Apr 24 23:54:00.130222 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.130207 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 23:54:00.132705 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.132686 2576 fs.go:135] Filesystem UUIDs: map[45975ba1-071a-4bf1-b46a-f9b735fee952:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 8faecf8e-f941-4ece-832d-f13dea24169d:/dev/nvme0n1p4]
Apr 24 23:54:00.132773 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.132705 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 23:54:00.133138 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.133122 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 23:54:00.139525 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.139411 2576 manager.go:217] Machine: {Timestamp:2026-04-24 23:54:00.13749706 +0000 UTC m=+0.418613971 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3104530 MemoryCapacity:32812163072 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2874928bbaa860726a6528fa935378 SystemUUID:ec287492-8bba-a860-726a-6528fa935378 BootID:7afb8ad8-7791-42e0-b7df-de69ae7e6865 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406081536 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7d:96:bd:1a:21 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7d:96:bd:1a:21 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:7a:2c:e7:89:93:ec Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812163072 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 23:54:00.140269 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.140258 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 23:54:00.140377 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.140366 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 23:54:00.142849 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.142824 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 23:54:00.143016 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.142853 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-109.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 23:54:00.143065 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.143030 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 23:54:00.143065 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.143039 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 23:54:00.143065 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.143052 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 23:54:00.144038 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.144027 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 23:54:00.144857 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.144848 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 23:54:00.144990 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.144981 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 23:54:00.148183 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.148173 2576 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 23:54:00.148221 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.148186 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 23:54:00.148221 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.148203 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 23:54:00.148221 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.148212 2576 kubelet.go:397] "Adding apiserver pod source"
Apr 24 23:54:00.148302 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.148236 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 23:54:00.149429 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.149418 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 23:54:00.149465 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.149442 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 23:54:00.153133 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.153107 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 23:54:00.153805 ip-10-0-129-109
kubenswrapper[2576]: I0424 23:54:00.153778 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rl6nx" Apr 24 23:54:00.155150 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.155135 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 23:54:00.157030 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.157016 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 23:54:00.157099 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.157034 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 23:54:00.157099 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.157043 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 23:54:00.157099 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.157052 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 23:54:00.157099 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.157061 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 23:54:00.157099 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.157070 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 23:54:00.157099 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.157079 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 23:54:00.157099 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.157099 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 23:54:00.157277 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.157109 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 23:54:00.157277 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.157118 2576 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 23:54:00.157277 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.157127 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 23:54:00.157277 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.157137 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 23:54:00.157965 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.157951 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 23:54:00.158024 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.157970 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 23:54:00.158162 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:00.158136 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-109.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 23:54:00.158312 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:00.158290 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 23:54:00.161804 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.161791 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 23:54:00.161843 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.161822 2576 server.go:1295] "Started kubelet" Apr 24 23:54:00.161884 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.161790 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rl6nx" Apr 24 23:54:00.161965 ip-10-0-129-109 
kubenswrapper[2576]: I0424 23:54:00.161902 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 23:54:00.162028 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.161989 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 23:54:00.162272 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.162251 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 23:54:00.162620 ip-10-0-129-109 systemd[1]: Started Kubernetes Kubelet. Apr 24 23:54:00.163937 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.163897 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 23:54:00.165175 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.165159 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 24 23:54:00.171139 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.171123 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-109.ec2.internal" not found Apr 24 23:54:00.171231 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.171206 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 23:54:00.172082 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.172026 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 23:54:00.174585 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.174426 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 23:54:00.174696 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.174684 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 23:54:00.174947 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.174928 2576 factory.go:55] Registering systemd factory Apr 24 23:54:00.175032 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.174727 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 
24 23:54:00.175032 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.174999 2576 factory.go:223] Registration of the systemd container factory successfully Apr 24 23:54:00.175123 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:00.174715 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-109.ec2.internal\" not found" Apr 24 23:54:00.175266 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.175232 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:54:00.175403 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.175382 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 24 23:54:00.175403 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.175400 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 24 23:54:00.175827 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.175808 2576 factory.go:153] Registering CRI-O factory Apr 24 23:54:00.175909 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.175830 2576 factory.go:223] Registration of the crio container factory successfully Apr 24 23:54:00.175990 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.175944 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 23:54:00.175990 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.175975 2576 factory.go:103] Registering Raw factory Apr 24 23:54:00.176092 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.175991 2576 manager.go:1196] Started watching for new ooms in manager Apr 24 23:54:00.176863 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.176847 2576 manager.go:319] Starting recovery of all containers Apr 24 23:54:00.177813 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:00.177792 2576 nodelease.go:49] "Failed to get node 
when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-109.ec2.internal\" not found" node="ip-10-0-129-109.ec2.internal" Apr 24 23:54:00.178550 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:00.178528 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 23:54:00.187576 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.187554 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-109.ec2.internal" not found Apr 24 23:54:00.188866 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.188851 2576 manager.go:324] Recovery completed Apr 24 23:54:00.192649 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.192636 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:54:00.194942 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.194909 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-109.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:54:00.195005 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.194956 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-109.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:54:00.195005 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.194966 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-109.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:54:00.195442 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.195421 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 23:54:00.195442 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.195434 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 23:54:00.195523 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.195451 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 24 23:54:00.197778 
ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.197767 2576 policy_none.go:49] "None policy: Start" Apr 24 23:54:00.197840 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.197783 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 23:54:00.197840 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.197792 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 24 23:54:00.243493 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.243476 2576 manager.go:341] "Starting Device Plugin manager" Apr 24 23:54:00.244743 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:00.243515 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 23:54:00.244743 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.243525 2576 server.go:85] "Starting device plugin registration server" Apr 24 23:54:00.244743 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.243731 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 23:54:00.244743 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.243750 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 23:54:00.244743 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.243820 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 23:54:00.244743 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.243891 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 23:54:00.244743 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.243900 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 23:54:00.244743 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:00.244393 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 24 23:54:00.244743 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:00.244431 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-109.ec2.internal\" not found" Apr 24 23:54:00.245664 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.245649 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-109.ec2.internal" not found Apr 24 23:54:00.309396 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.309336 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 23:54:00.310814 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.310792 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 23:54:00.310814 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.310818 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 23:54:00.310995 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.310845 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 23:54:00.310995 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.310855 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 23:54:00.310995 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:00.310886 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 23:54:00.312839 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.312820 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:54:00.343854 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.343817 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:54:00.344811 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.344793 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-109.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:54:00.344940 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.344834 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-109.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:54:00.344940 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.344849 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-109.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:54:00.344940 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.344877 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-109.ec2.internal" Apr 24 23:54:00.354111 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.354093 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-109.ec2.internal" Apr 24 23:54:00.354213 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:00.354120 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-109.ec2.internal\": node \"ip-10-0-129-109.ec2.internal\" not found" Apr 24 
23:54:00.373386 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:00.373364 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-109.ec2.internal\" not found" Apr 24 23:54:00.411690 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.411667 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-109.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-109.ec2.internal"] Apr 24 23:54:00.411744 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.411734 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:54:00.412612 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.412599 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-109.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:54:00.412676 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.412623 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-109.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:54:00.412676 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.412633 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-109.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:54:00.413876 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.413848 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:54:00.414028 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.414012 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-109.ec2.internal" Apr 24 23:54:00.414072 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.414043 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:54:00.414849 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.414833 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-109.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:54:00.414946 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.414865 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-109.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:54:00.414946 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.414879 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-109.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:54:00.414946 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.414835 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-109.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:54:00.414946 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.414946 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-109.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:54:00.415135 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.414955 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-109.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:54:00.416317 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.416301 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-109.ec2.internal" Apr 24 23:54:00.416365 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.416328 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:54:00.417978 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.417964 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-109.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:54:00.418043 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.417994 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-109.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:54:00.418043 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.418017 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-109.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:54:00.445533 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:00.445517 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-109.ec2.internal\" not found" node="ip-10-0-129-109.ec2.internal" Apr 24 23:54:00.449782 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:00.449769 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-109.ec2.internal\" not found" node="ip-10-0-129-109.ec2.internal" Apr 24 23:54:00.473447 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:00.473426 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-109.ec2.internal\" not found" Apr 24 23:54:00.477597 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.477580 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/17afc3e90e99d1cc5da6f4eb47b9540d-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-129-109.ec2.internal\" (UID: \"17afc3e90e99d1cc5da6f4eb47b9540d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-109.ec2.internal" Apr 24 23:54:00.477662 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.477604 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17afc3e90e99d1cc5da6f4eb47b9540d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-109.ec2.internal\" (UID: \"17afc3e90e99d1cc5da6f4eb47b9540d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-109.ec2.internal" Apr 24 23:54:00.477662 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.477622 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d2e34ff8c6ad1ea6f4df1f49c9e13e6c-config\") pod \"kube-apiserver-proxy-ip-10-0-129-109.ec2.internal\" (UID: \"d2e34ff8c6ad1ea6f4df1f49c9e13e6c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-109.ec2.internal" Apr 24 23:54:00.574508 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:00.574451 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-109.ec2.internal\" not found" Apr 24 23:54:00.578752 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.578734 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/17afc3e90e99d1cc5da6f4eb47b9540d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-109.ec2.internal\" (UID: \"17afc3e90e99d1cc5da6f4eb47b9540d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-109.ec2.internal" Apr 24 23:54:00.578814 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.578699 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/17afc3e90e99d1cc5da6f4eb47b9540d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-109.ec2.internal\" (UID: \"17afc3e90e99d1cc5da6f4eb47b9540d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-109.ec2.internal" Apr 24 23:54:00.578850 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.578821 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17afc3e90e99d1cc5da6f4eb47b9540d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-109.ec2.internal\" (UID: \"17afc3e90e99d1cc5da6f4eb47b9540d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-109.ec2.internal" Apr 24 23:54:00.578850 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.578843 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d2e34ff8c6ad1ea6f4df1f49c9e13e6c-config\") pod \"kube-apiserver-proxy-ip-10-0-129-109.ec2.internal\" (UID: \"d2e34ff8c6ad1ea6f4df1f49c9e13e6c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-109.ec2.internal" Apr 24 23:54:00.578938 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.578867 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d2e34ff8c6ad1ea6f4df1f49c9e13e6c-config\") pod \"kube-apiserver-proxy-ip-10-0-129-109.ec2.internal\" (UID: \"d2e34ff8c6ad1ea6f4df1f49c9e13e6c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-109.ec2.internal" Apr 24 23:54:00.578938 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.578903 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17afc3e90e99d1cc5da6f4eb47b9540d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-109.ec2.internal\" (UID: \"17afc3e90e99d1cc5da6f4eb47b9540d\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-109.ec2.internal"
Apr 24 23:54:00.674851 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:00.674822 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-109.ec2.internal\" not found"
Apr 24 23:54:00.748257 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.748222 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-109.ec2.internal"
Apr 24 23:54:00.752964 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:00.752942 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-109.ec2.internal"
Apr 24 23:54:00.775879 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:00.775854 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-109.ec2.internal\" not found"
Apr 24 23:54:00.876435 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:00.876367 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-109.ec2.internal\" not found"
Apr 24 23:54:00.976975 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:00.976944 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-109.ec2.internal\" not found"
Apr 24 23:54:01.076476 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:01.076447 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 23:54:01.077129 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:01.076570 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 23:54:01.077129 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:01.076578 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 23:54:01.077528 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:01.077513 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-109.ec2.internal\" not found"
Apr 24 23:54:01.164325 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:01.164250 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 23:49:00 +0000 UTC" deadline="2028-01-27 17:10:04.90596586 +0000 UTC"
Apr 24 23:54:01.164325 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:01.164284 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15425h16m3.741685413s"
Apr 24 23:54:01.171379 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:01.171355 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 23:54:01.178491 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:01.178467 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-109.ec2.internal\" not found"
Apr 24 23:54:01.196132 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:01.196105 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 23:54:01.215078 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:01.215049 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17afc3e90e99d1cc5da6f4eb47b9540d.slice/crio-dcccd30cfcb0d7b391306d0ab61cfdb270412b7a0c81ce75f43b101dbe324e53 WatchSource:0}: Error finding container dcccd30cfcb0d7b391306d0ab61cfdb270412b7a0c81ce75f43b101dbe324e53: Status 404 returned error can't find the container with id dcccd30cfcb0d7b391306d0ab61cfdb270412b7a0c81ce75f43b101dbe324e53
Apr 24 23:54:01.215460 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:01.215438 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2e34ff8c6ad1ea6f4df1f49c9e13e6c.slice/crio-8abeebf72f0b2f8a8375f766998763052bc6a721d712d67f45a8e02ec2c9485f WatchSource:0}: Error finding container 8abeebf72f0b2f8a8375f766998763052bc6a721d712d67f45a8e02ec2c9485f: Status 404 returned error can't find the container with id 8abeebf72f0b2f8a8375f766998763052bc6a721d712d67f45a8e02ec2c9485f
Apr 24 23:54:01.219998 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:01.219986 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 23:54:01.232517 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:01.232493 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-r2sfg"
Apr 24 23:54:01.239723 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:01.239705 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-r2sfg"
Apr 24 23:54:01.279326 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:01.279302 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-109.ec2.internal\" not found"
Apr 24 23:54:01.314246 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:01.314201 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-109.ec2.internal" event={"ID":"17afc3e90e99d1cc5da6f4eb47b9540d","Type":"ContainerStarted","Data":"dcccd30cfcb0d7b391306d0ab61cfdb270412b7a0c81ce75f43b101dbe324e53"}
Apr 24 23:54:01.314902 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:01.314881 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-109.ec2.internal" event={"ID":"d2e34ff8c6ad1ea6f4df1f49c9e13e6c","Type":"ContainerStarted","Data":"8abeebf72f0b2f8a8375f766998763052bc6a721d712d67f45a8e02ec2c9485f"}
Apr 24 23:54:01.380287 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:01.380263 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-109.ec2.internal\" not found"
Apr 24 23:54:01.434941 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:01.434876 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:54:01.481264 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:01.481234 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-109.ec2.internal\" not found"
Apr 24 23:54:01.536289 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:01.536257 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:54:01.573436 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:01.573401 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-109.ec2.internal"
Apr 24 23:54:01.584690 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:01.584656 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 23:54:01.585501 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:01.585481 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-109.ec2.internal"
Apr 24 23:54:01.594280 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:01.594256 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 23:54:01.935031 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:01.935002 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:54:02.149975 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.149944 2576 apiserver.go:52] "Watching apiserver"
Apr 24 23:54:02.157831 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.157800 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 23:54:02.158295 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.158271 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-129-109.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk","openshift-cluster-node-tuning-operator/tuned-6lz9z","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-109.ec2.internal","openshift-multus/multus-g9gll","openshift-multus/network-metrics-daemon-fdw8f","openshift-network-operator/iptables-alerter-qkrqf","openshift-dns/node-resolver-58s9z","openshift-image-registry/node-ca-6fc98","openshift-multus/multus-additional-cni-plugins-brnc4","openshift-network-diagnostics/network-check-target-mthk5","openshift-ovn-kubernetes/ovnkube-node-27ksn","kube-system/konnectivity-agent-48h4q"]
Apr 24 23:54:02.159876 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.159858 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-brnc4"
Apr 24 23:54:02.161133 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.161108 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk"
Apr 24 23:54:02.162241 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.162218 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.162812 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.162790 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 23:54:02.163083 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.163064 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 23:54:02.163334 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.163318 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 23:54:02.163418 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.163377 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hhhm8\""
Apr 24 23:54:02.163599 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.163579 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 23:54:02.163599 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.163593 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 23:54:02.163877 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.163859 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 23:54:02.163988 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.163967 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 23:54:02.164323 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.164308 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7zkw6\""
Apr 24 23:54:02.164323 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.164317 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 23:54:02.164487 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.164472 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.164572 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.164551 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 23:54:02.164660 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.164590 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-xvdhn\""
Apr 24 23:54:02.164800 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.164784 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 23:54:02.165682 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.165653 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdw8f"
Apr 24 23:54:02.165777 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:02.165736 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdw8f" podUID="6206bc2d-d85c-4007-8a04-e9eb243f590c"
Apr 24 23:54:02.167100 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.167077 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-9nv4r\""
Apr 24 23:54:02.167231 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.167213 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 23:54:02.167379 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.167361 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-58s9z"
Apr 24 23:54:02.169217 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.168975 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qkrqf"
Apr 24 23:54:02.169217 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.169064 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-6fc98"
Apr 24 23:54:02.170100 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.170081 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cv7z5\""
Apr 24 23:54:02.170212 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.170117 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 23:54:02.170212 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.170149 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 23:54:02.170434 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.170414 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mthk5"
Apr 24 23:54:02.170509 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:02.170488 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mthk5" podUID="acf3640a-1870-4ea5-b4cb-f6e0d7abccf0"
Apr 24 23:54:02.171401 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.171384 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 23:54:02.171715 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.171693 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 23:54:02.171715 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.171693 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-n7rsm\""
Apr 24 23:54:02.171844 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.171722 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 23:54:02.171844 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.171703 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.172066 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.172043 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 23:54:02.172160 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.172113 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 23:54:02.172236 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.172218 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 23:54:02.172309 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.172043 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-8s2v9\""
Apr 24 23:54:02.172902 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.172884 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-48h4q"
Apr 24 23:54:02.174095 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.174076 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 23:54:02.174790 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.174768 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 23:54:02.174953 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.174774 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 23:54:02.175027 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.174961 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 23:54:02.175027 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.175008 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 23:54:02.175186 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.175171 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-94qv5\""
Apr 24 23:54:02.175186 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.175183 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dcgnw\""
Apr 24 23:54:02.175291 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.175232 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 23:54:02.175368 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.175352 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 23:54:02.175482 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.175470 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 23:54:02.176108 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.176082 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 23:54:02.188940 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.188879 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-etc-sysctl-d\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.188940 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.188905 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-host-run-k8s-cni-cncf-io\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.189085 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.188944 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl9pt\" (UniqueName: \"kubernetes.io/projected/ad8d234f-a974-4a38-8d63-b660058eeb43-kube-api-access-kl9pt\") pod \"iptables-alerter-qkrqf\" (UID: \"ad8d234f-a974-4a38-8d63-b660058eeb43\") " pod="openshift-network-operator/iptables-alerter-qkrqf"
Apr 24 23:54:02.189085 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.188976 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-host-kubelet\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.189085 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189006 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fj4g\" (UniqueName: \"kubernetes.io/projected/f800603e-9119-44c9-9253-07fb97437cd7-kube-api-access-9fj4g\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4"
Apr 24 23:54:02.189085 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189035 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-cnibin\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.189085 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189053 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-cni-binary-copy\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.189085 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189067 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/808b8c86-7996-4c7e-b677-dc648c7c5598-host\") pod \"node-ca-6fc98\" (UID: \"808b8c86-7996-4c7e-b677-dc648c7c5598\") " pod="openshift-image-registry/node-ca-6fc98"
Apr 24 23:54:02.189366 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189089 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-var-lib-openvswitch\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.189366 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189111 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-run-ovn\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.189366 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189132 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-host-cni-netd\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.189366 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189164 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-etc-sysctl-conf\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.189366 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189181 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/27ed6ad4-863b-4379-8e79-0244d71ad92d-env-overrides\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.189366 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189203 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/27ed6ad4-863b-4379-8e79-0244d71ad92d-ovn-node-metrics-cert\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.189366 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189225 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/27ed6ad4-863b-4379-8e79-0244d71ad92d-ovnkube-script-lib\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.189366 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189269 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/038b3357-ba5f-4aa6-8bda-d7a61161c9ce-konnectivity-ca\") pod \"konnectivity-agent-48h4q\" (UID: \"038b3357-ba5f-4aa6-8bda-d7a61161c9ce\") " pod="kube-system/konnectivity-agent-48h4q"
Apr 24 23:54:02.189366 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189284 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f800603e-9119-44c9-9253-07fb97437cd7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4"
Apr 24 23:54:02.189366 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189298 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/68924e4d-1b30-4887-a8bc-c624385685df-hosts-file\") pod \"node-resolver-58s9z\" (UID: \"68924e4d-1b30-4887-a8bc-c624385685df\") " pod="openshift-dns/node-resolver-58s9z"
Apr 24 23:54:02.189366 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189312 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-host\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.189366 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189334 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-run-openvswitch\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.189366 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189362 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/808b8c86-7996-4c7e-b677-dc648c7c5598-serviceca\") pod \"node-ca-6fc98\" (UID: \"808b8c86-7996-4c7e-b677-dc648c7c5598\") " pod="openshift-image-registry/node-ca-6fc98"
Apr 24 23:54:02.190029 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189386 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-etc-modprobe-d\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.190029 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-system-cni-dir\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.190029 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189446 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/038b3357-ba5f-4aa6-8bda-d7a61161c9ce-agent-certs\") pod \"konnectivity-agent-48h4q\" (UID: \"038b3357-ba5f-4aa6-8bda-d7a61161c9ce\") " pod="kube-system/konnectivity-agent-48h4q"
Apr 24 23:54:02.190029 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189476 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f800603e-9119-44c9-9253-07fb97437cd7-cnibin\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4"
Apr 24 23:54:02.190029 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189508 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-host-run-multus-certs\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.190029 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189534 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/68924e4d-1b30-4887-a8bc-c624385685df-tmp-dir\") pod \"node-resolver-58s9z\" (UID: \"68924e4d-1b30-4887-a8bc-c624385685df\") " pod="openshift-dns/node-resolver-58s9z"
Apr 24 23:54:02.190029 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189561 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-run\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.190029 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189588 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-host-run-ovn-kubernetes\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.190029 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189614 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f800603e-9119-44c9-9253-07fb97437cd7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4"
Apr 24 23:54:02.190029 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189641 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-host-var-lib-cni-bin\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.190029 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189665 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p6wx\" (UniqueName: \"kubernetes.io/projected/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-kube-api-access-7p6wx\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.190029 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189692 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfqlc\" (UniqueName: \"kubernetes.io/projected/68924e4d-1b30-4887-a8bc-c624385685df-kube-api-access-tfqlc\") pod \"node-resolver-58s9z\" (UID: \"68924e4d-1b30-4887-a8bc-c624385685df\") " pod="openshift-dns/node-resolver-58s9z"
Apr 24 23:54:02.190029 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189715 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsm7z\" (UniqueName: \"kubernetes.io/projected/6633b011-7fd6-404a-b15b-b4d8f7c11aba-kube-api-access-zsm7z\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.190029 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189738 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-host-var-lib-cni-multus\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.190029 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189759 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-multus-conf-dir\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.190029 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189780 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-sys\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.190029 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189801 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-host-slash\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.190848 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189824 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.190848 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189856 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f800603e-9119-44c9-9253-07fb97437cd7-os-release\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4"
Apr 24 23:54:02.190848 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189885 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f800603e-9119-44c9-9253-07fb97437cd7-cni-binary-copy\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4"
Apr 24 23:54:02.190848 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.189963 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-hostroot\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.190848 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190002 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/64f94142-36c9-443f-988d-974f8671f7fe-socket-dir\") pod \"aws-ebs-csi-driver-node-5h2dk\" (UID: \"64f94142-36c9-443f-988d-974f8671f7fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk"
Apr 24 23:54:02.190848 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190030 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-host-run-netns\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.190848 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190065 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-multus-cni-dir\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.190848 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190103 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-host-run-netns\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") "
pod="openshift-multus/multus-g9gll" Apr 24 23:54:02.190848 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190128 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-multus-daemon-config\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll" Apr 24 23:54:02.190848 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190157 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ad8d234f-a974-4a38-8d63-b660058eeb43-host-slash\") pod \"iptables-alerter-qkrqf\" (UID: \"ad8d234f-a974-4a38-8d63-b660058eeb43\") " pod="openshift-network-operator/iptables-alerter-qkrqf" Apr 24 23:54:02.190848 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190179 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-etc-systemd\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z" Apr 24 23:54:02.190848 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190197 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs\") pod \"network-metrics-daemon-fdw8f\" (UID: \"6206bc2d-d85c-4007-8a04-e9eb243f590c\") " pod="openshift-multus/network-metrics-daemon-fdw8f" Apr 24 23:54:02.190848 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190213 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/27ed6ad4-863b-4379-8e79-0244d71ad92d-ovnkube-config\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:02.190848 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190226 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/64f94142-36c9-443f-988d-974f8671f7fe-etc-selinux\") pod \"aws-ebs-csi-driver-node-5h2dk\" (UID: \"64f94142-36c9-443f-988d-974f8671f7fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk" Apr 24 23:54:02.190848 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190240 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-etc-sysconfig\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z" Apr 24 23:54:02.190848 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190253 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-var-lib-kubelet\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z" Apr 24 23:54:02.191598 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190268 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h97d\" (UniqueName: \"kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d\") pod \"network-check-target-mthk5\" (UID: \"acf3640a-1870-4ea5-b4cb-f6e0d7abccf0\") " pod="openshift-network-diagnostics/network-check-target-mthk5" Apr 24 23:54:02.191598 ip-10-0-129-109 
kubenswrapper[2576]: I0424 23:54:02.190282 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-etc-openvswitch\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:02.191598 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190325 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-multus-socket-dir-parent\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll" Apr 24 23:54:02.191598 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190349 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8h2s\" (UniqueName: \"kubernetes.io/projected/808b8c86-7996-4c7e-b677-dc648c7c5598-kube-api-access-n8h2s\") pod \"node-ca-6fc98\" (UID: \"808b8c86-7996-4c7e-b677-dc648c7c5598\") " pod="openshift-image-registry/node-ca-6fc98" Apr 24 23:54:02.191598 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190366 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/64f94142-36c9-443f-988d-974f8671f7fe-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5h2dk\" (UID: \"64f94142-36c9-443f-988d-974f8671f7fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk" Apr 24 23:54:02.191598 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190389 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v959\" (UniqueName: \"kubernetes.io/projected/64f94142-36c9-443f-988d-974f8671f7fe-kube-api-access-7v959\") pod 
\"aws-ebs-csi-driver-node-5h2dk\" (UID: \"64f94142-36c9-443f-988d-974f8671f7fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk" Apr 24 23:54:02.191598 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190408 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6633b011-7fd6-404a-b15b-b4d8f7c11aba-tmp\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z" Apr 24 23:54:02.191598 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190425 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-systemd-units\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:02.191598 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190464 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-node-log\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:02.191598 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190492 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-host-cni-bin\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:02.191598 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190505 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kj2kt\" (UniqueName: \"kubernetes.io/projected/27ed6ad4-863b-4379-8e79-0244d71ad92d-kube-api-access-kj2kt\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:02.191598 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190524 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/64f94142-36c9-443f-988d-974f8671f7fe-sys-fs\") pod \"aws-ebs-csi-driver-node-5h2dk\" (UID: \"64f94142-36c9-443f-988d-974f8671f7fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk" Apr 24 23:54:02.191598 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190546 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-lib-modules\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z" Apr 24 23:54:02.191598 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190579 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9pd6\" (UniqueName: \"kubernetes.io/projected/6206bc2d-d85c-4007-8a04-e9eb243f590c-kube-api-access-q9pd6\") pod \"network-metrics-daemon-fdw8f\" (UID: \"6206bc2d-d85c-4007-8a04-e9eb243f590c\") " pod="openshift-multus/network-metrics-daemon-fdw8f" Apr 24 23:54:02.191598 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190593 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-log-socket\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:02.191598 
ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190614 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f800603e-9119-44c9-9253-07fb97437cd7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4" Apr 24 23:54:02.192349 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190637 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-os-release\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll" Apr 24 23:54:02.192349 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190652 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-etc-kubernetes\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll" Apr 24 23:54:02.192349 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190666 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/64f94142-36c9-443f-988d-974f8671f7fe-registration-dir\") pod \"aws-ebs-csi-driver-node-5h2dk\" (UID: \"64f94142-36c9-443f-988d-974f8671f7fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk" Apr 24 23:54:02.192349 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190680 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-etc-kubernetes\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z" Apr 24 23:54:02.192349 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190701 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6633b011-7fd6-404a-b15b-b4d8f7c11aba-etc-tuned\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z" Apr 24 23:54:02.192349 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190731 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-run-systemd\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:02.192349 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190772 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f800603e-9119-44c9-9253-07fb97437cd7-system-cni-dir\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4" Apr 24 23:54:02.192349 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190806 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-host-var-lib-kubelet\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll" Apr 24 23:54:02.192349 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190832 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ad8d234f-a974-4a38-8d63-b660058eeb43-iptables-alerter-script\") pod \"iptables-alerter-qkrqf\" (UID: \"ad8d234f-a974-4a38-8d63-b660058eeb43\") " pod="openshift-network-operator/iptables-alerter-qkrqf" Apr 24 23:54:02.192349 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.190870 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/64f94142-36c9-443f-988d-974f8671f7fe-device-dir\") pod \"aws-ebs-csi-driver-node-5h2dk\" (UID: \"64f94142-36c9-443f-988d-974f8671f7fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk" Apr 24 23:54:02.240476 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.240448 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 23:49:01 +0000 UTC" deadline="2027-12-28 04:15:21.772077021 +0000 UTC" Apr 24 23:54:02.240573 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.240475 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14692h21m19.531604942s" Apr 24 23:54:02.291937 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.291895 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/64f94142-36c9-443f-988d-974f8671f7fe-registration-dir\") pod \"aws-ebs-csi-driver-node-5h2dk\" (UID: \"64f94142-36c9-443f-988d-974f8671f7fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk" Apr 24 23:54:02.292055 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.291948 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-etc-kubernetes\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z" Apr 24 23:54:02.292055 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.291970 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6633b011-7fd6-404a-b15b-b4d8f7c11aba-etc-tuned\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z" Apr 24 23:54:02.292055 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.291989 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-run-systemd\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:02.292055 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292012 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f800603e-9119-44c9-9253-07fb97437cd7-system-cni-dir\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4" Apr 24 23:54:02.292055 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292012 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/64f94142-36c9-443f-988d-974f8671f7fe-registration-dir\") pod \"aws-ebs-csi-driver-node-5h2dk\" (UID: \"64f94142-36c9-443f-988d-974f8671f7fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk" Apr 24 23:54:02.292055 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292033 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-host-var-lib-kubelet\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll" Apr 24 23:54:02.292055 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292054 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ad8d234f-a974-4a38-8d63-b660058eeb43-iptables-alerter-script\") pod \"iptables-alerter-qkrqf\" (UID: \"ad8d234f-a974-4a38-8d63-b660058eeb43\") " pod="openshift-network-operator/iptables-alerter-qkrqf" Apr 24 23:54:02.292291 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292073 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-run-systemd\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:02.292291 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292074 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f800603e-9119-44c9-9253-07fb97437cd7-system-cni-dir\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4" Apr 24 23:54:02.292291 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292078 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/64f94142-36c9-443f-988d-974f8671f7fe-device-dir\") pod \"aws-ebs-csi-driver-node-5h2dk\" (UID: \"64f94142-36c9-443f-988d-974f8671f7fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk" Apr 24 23:54:02.292291 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292102 
2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-etc-kubernetes\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z" Apr 24 23:54:02.292291 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292122 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-host-var-lib-kubelet\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll" Apr 24 23:54:02.292291 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292126 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-etc-sysctl-d\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z" Apr 24 23:54:02.292291 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292153 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-host-run-k8s-cni-cncf-io\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll" Apr 24 23:54:02.292291 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292169 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kl9pt\" (UniqueName: \"kubernetes.io/projected/ad8d234f-a974-4a38-8d63-b660058eeb43-kube-api-access-kl9pt\") pod \"iptables-alerter-qkrqf\" (UID: \"ad8d234f-a974-4a38-8d63-b660058eeb43\") " pod="openshift-network-operator/iptables-alerter-qkrqf" Apr 24 23:54:02.292291 ip-10-0-129-109 kubenswrapper[2576]: I0424 
23:54:02.292189 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-host-kubelet\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:02.292291 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292214 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fj4g\" (UniqueName: \"kubernetes.io/projected/f800603e-9119-44c9-9253-07fb97437cd7-kube-api-access-9fj4g\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4" Apr 24 23:54:02.292291 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292221 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/64f94142-36c9-443f-988d-974f8671f7fe-device-dir\") pod \"aws-ebs-csi-driver-node-5h2dk\" (UID: \"64f94142-36c9-443f-988d-974f8671f7fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk" Apr 24 23:54:02.292291 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292241 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-cnibin\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll" Apr 24 23:54:02.292291 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292263 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-cni-binary-copy\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll" Apr 24 23:54:02.292291 ip-10-0-129-109 
kubenswrapper[2576]: I0424 23:54:02.292267 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-host-run-k8s-cni-cncf-io\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll" Apr 24 23:54:02.292291 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292283 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/808b8c86-7996-4c7e-b677-dc648c7c5598-host\") pod \"node-ca-6fc98\" (UID: \"808b8c86-7996-4c7e-b677-dc648c7c5598\") " pod="openshift-image-registry/node-ca-6fc98" Apr 24 23:54:02.292291 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292229 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-etc-sysctl-d\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z" Apr 24 23:54:02.292813 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292307 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-var-lib-openvswitch\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:02.292813 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-run-ovn\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:02.292813 ip-10-0-129-109 kubenswrapper[2576]: I0424 
23:54:02.292335 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-cnibin\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.292813 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292350 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-host-cni-netd\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.292813 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292371 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-host-kubelet\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.292813 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-etc-sysctl-conf\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.292813 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292399 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/27ed6ad4-863b-4379-8e79-0244d71ad92d-env-overrides\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.292813 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/27ed6ad4-863b-4379-8e79-0244d71ad92d-ovn-node-metrics-cert\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.292813 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292443 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/27ed6ad4-863b-4379-8e79-0244d71ad92d-ovnkube-script-lib\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.292813 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292467 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/038b3357-ba5f-4aa6-8bda-d7a61161c9ce-konnectivity-ca\") pod \"konnectivity-agent-48h4q\" (UID: \"038b3357-ba5f-4aa6-8bda-d7a61161c9ce\") " pod="kube-system/konnectivity-agent-48h4q"
Apr 24 23:54:02.292813 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292491 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-host-cni-netd\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.292813 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292501 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/808b8c86-7996-4c7e-b677-dc648c7c5598-host\") pod \"node-ca-6fc98\" (UID: \"808b8c86-7996-4c7e-b677-dc648c7c5598\") " pod="openshift-image-registry/node-ca-6fc98"
Apr 24 23:54:02.292813 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292494 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f800603e-9119-44c9-9253-07fb97437cd7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4"
Apr 24 23:54:02.292813 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292528 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ad8d234f-a974-4a38-8d63-b660058eeb43-iptables-alerter-script\") pod \"iptables-alerter-qkrqf\" (UID: \"ad8d234f-a974-4a38-8d63-b660058eeb43\") " pod="openshift-network-operator/iptables-alerter-qkrqf"
Apr 24 23:54:02.292813 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292540 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-var-lib-openvswitch\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.292813 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292547 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/68924e4d-1b30-4887-a8bc-c624385685df-hosts-file\") pod \"node-resolver-58s9z\" (UID: \"68924e4d-1b30-4887-a8bc-c624385685df\") " pod="openshift-dns/node-resolver-58s9z"
Apr 24 23:54:02.292813 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292556 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-run-ovn\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.292813 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292575 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-host\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.293451 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292606 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-run-openvswitch\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.293451 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292630 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/808b8c86-7996-4c7e-b677-dc648c7c5598-serviceca\") pod \"node-ca-6fc98\" (UID: \"808b8c86-7996-4c7e-b677-dc648c7c5598\") " pod="openshift-image-registry/node-ca-6fc98"
Apr 24 23:54:02.293451 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292650 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-etc-modprobe-d\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.293451 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292672 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-system-cni-dir\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.293451 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292695 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/038b3357-ba5f-4aa6-8bda-d7a61161c9ce-agent-certs\") pod \"konnectivity-agent-48h4q\" (UID: \"038b3357-ba5f-4aa6-8bda-d7a61161c9ce\") " pod="kube-system/konnectivity-agent-48h4q"
Apr 24 23:54:02.293451 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292718 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f800603e-9119-44c9-9253-07fb97437cd7-cnibin\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4"
Apr 24 23:54:02.293451 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292744 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-host-run-multus-certs\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.293451 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292765 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/68924e4d-1b30-4887-a8bc-c624385685df-tmp-dir\") pod \"node-resolver-58s9z\" (UID: \"68924e4d-1b30-4887-a8bc-c624385685df\") " pod="openshift-dns/node-resolver-58s9z"
Apr 24 23:54:02.293451 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292786 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-run\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.293451 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292807 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-host-run-ovn-kubernetes\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.293451 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292833 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f800603e-9119-44c9-9253-07fb97437cd7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4"
Apr 24 23:54:02.293451 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292859 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-host-var-lib-cni-bin\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.293451 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292882 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7p6wx\" (UniqueName: \"kubernetes.io/projected/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-kube-api-access-7p6wx\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.293451 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292898 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 23:54:02.293451 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292908 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfqlc\" (UniqueName: \"kubernetes.io/projected/68924e4d-1b30-4887-a8bc-c624385685df-kube-api-access-tfqlc\") pod \"node-resolver-58s9z\" (UID: \"68924e4d-1b30-4887-a8bc-c624385685df\") " pod="openshift-dns/node-resolver-58s9z"
Apr 24 23:54:02.293451 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292955 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsm7z\" (UniqueName: \"kubernetes.io/projected/6633b011-7fd6-404a-b15b-b4d8f7c11aba-kube-api-access-zsm7z\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.293451 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.292979 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-host-var-lib-cni-multus\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.293451 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293003 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-multus-conf-dir\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.294309 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293024 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-cni-binary-copy\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.294309 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293028 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-sys\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.294309 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293041 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/27ed6ad4-863b-4379-8e79-0244d71ad92d-env-overrides\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.294309 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293052 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-host-slash\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.294309 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293077 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.294309 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293112 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-etc-sysctl-conf\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.294309 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293112 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f800603e-9119-44c9-9253-07fb97437cd7-os-release\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4"
Apr 24 23:54:02.294309 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293121 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f800603e-9119-44c9-9253-07fb97437cd7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4"
Apr 24 23:54:02.294309 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293141 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/68924e4d-1b30-4887-a8bc-c624385685df-hosts-file\") pod \"node-resolver-58s9z\" (UID: \"68924e4d-1b30-4887-a8bc-c624385685df\") " pod="openshift-dns/node-resolver-58s9z"
Apr 24 23:54:02.294309 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293136 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f800603e-9119-44c9-9253-07fb97437cd7-cni-binary-copy\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4"
Apr 24 23:54:02.294309 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293189 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-hostroot\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.294309 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293230 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/64f94142-36c9-443f-988d-974f8671f7fe-socket-dir\") pod \"aws-ebs-csi-driver-node-5h2dk\" (UID: \"64f94142-36c9-443f-988d-974f8671f7fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk"
Apr 24 23:54:02.294309 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293259 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-host-run-netns\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.294309 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293288 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-multus-cni-dir\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.294309 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293314 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-host-run-netns\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.294309 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293339 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-multus-daemon-config\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.294309 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293364 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ad8d234f-a974-4a38-8d63-b660058eeb43-host-slash\") pod \"iptables-alerter-qkrqf\" (UID: \"ad8d234f-a974-4a38-8d63-b660058eeb43\") " pod="openshift-network-operator/iptables-alerter-qkrqf"
Apr 24 23:54:02.295227 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293392 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-etc-systemd\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.295227 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs\") pod \"network-metrics-daemon-fdw8f\" (UID: \"6206bc2d-d85c-4007-8a04-e9eb243f590c\") " pod="openshift-multus/network-metrics-daemon-fdw8f"
Apr 24 23:54:02.295227 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293444 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/27ed6ad4-863b-4379-8e79-0244d71ad92d-ovnkube-config\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.295227 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293471 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/64f94142-36c9-443f-988d-974f8671f7fe-etc-selinux\") pod \"aws-ebs-csi-driver-node-5h2dk\" (UID: \"64f94142-36c9-443f-988d-974f8671f7fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk"
Apr 24 23:54:02.295227 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293512 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/27ed6ad4-863b-4379-8e79-0244d71ad92d-ovnkube-script-lib\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.295227 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293496 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-etc-sysconfig\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.295227 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293543 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-host\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.295227 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293569 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/038b3357-ba5f-4aa6-8bda-d7a61161c9ce-konnectivity-ca\") pod \"konnectivity-agent-48h4q\" (UID: \"038b3357-ba5f-4aa6-8bda-d7a61161c9ce\") " pod="kube-system/konnectivity-agent-48h4q"
Apr 24 23:54:02.295227 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293569 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-var-lib-kubelet\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.295227 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293591 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-run-openvswitch\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.295227 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293617 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4h97d\" (UniqueName: \"kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d\") pod \"network-check-target-mthk5\" (UID: \"acf3640a-1870-4ea5-b4cb-f6e0d7abccf0\") " pod="openshift-network-diagnostics/network-check-target-mthk5"
Apr 24 23:54:02.295227 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293611 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-var-lib-kubelet\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.295227 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-etc-openvswitch\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.295227 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293674 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-host-run-netns\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.295227 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293679 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-etc-openvswitch\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.295227 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-multus-socket-dir-parent\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.295227 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293740 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n8h2s\" (UniqueName: \"kubernetes.io/projected/808b8c86-7996-4c7e-b677-dc648c7c5598-kube-api-access-n8h2s\") pod \"node-ca-6fc98\" (UID: \"808b8c86-7996-4c7e-b677-dc648c7c5598\") " pod="openshift-image-registry/node-ca-6fc98"
Apr 24 23:54:02.296096 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293768 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/64f94142-36c9-443f-988d-974f8671f7fe-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5h2dk\" (UID: \"64f94142-36c9-443f-988d-974f8671f7fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk"
Apr 24 23:54:02.296096 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293796 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7v959\" (UniqueName: \"kubernetes.io/projected/64f94142-36c9-443f-988d-974f8671f7fe-kube-api-access-7v959\") pod \"aws-ebs-csi-driver-node-5h2dk\" (UID: \"64f94142-36c9-443f-988d-974f8671f7fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk"
Apr 24 23:54:02.296096 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293824 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6633b011-7fd6-404a-b15b-b4d8f7c11aba-tmp\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.296096 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293868 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-host-var-lib-cni-multus\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.296096 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293937 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-multus-conf-dir\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.296096 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293968 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/808b8c86-7996-4c7e-b677-dc648c7c5598-serviceca\") pod \"node-ca-6fc98\" (UID: \"808b8c86-7996-4c7e-b677-dc648c7c5598\") " pod="openshift-image-registry/node-ca-6fc98"
Apr 24 23:54:02.296096 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293999 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-sys\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.296096 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294027 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-systemd-units\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.296096 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294040 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-host-slash\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.296096 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.293619 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/64f94142-36c9-443f-988d-974f8671f7fe-socket-dir\") pod \"aws-ebs-csi-driver-node-5h2dk\" (UID: \"64f94142-36c9-443f-988d-974f8671f7fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk"
Apr 24 23:54:02.296096 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294073 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.296096 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294094 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-etc-modprobe-d\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.296096 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294102 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-node-log\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.296096 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294127 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-multus-cni-dir\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.296096 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294129 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-host-cni-bin\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.296096 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294145 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-system-cni-dir\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.296096 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294161 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kj2kt\" (UniqueName: \"kubernetes.io/projected/27ed6ad4-863b-4379-8e79-0244d71ad92d-kube-api-access-kj2kt\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.296096 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294188 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/64f94142-36c9-443f-988d-974f8671f7fe-sys-fs\") pod \"aws-ebs-csi-driver-node-5h2dk\" (UID: \"64f94142-36c9-443f-988d-974f8671f7fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk"
Apr 24 23:54:02.296889 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294211 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-lib-modules\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.296889 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294236 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9pd6\" (UniqueName: \"kubernetes.io/projected/6206bc2d-d85c-4007-8a04-e9eb243f590c-kube-api-access-q9pd6\") pod \"network-metrics-daemon-fdw8f\" (UID: \"6206bc2d-d85c-4007-8a04-e9eb243f590c\") " pod="openshift-multus/network-metrics-daemon-fdw8f"
Apr 24 23:54:02.296889 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294263 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-log-socket\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.296889 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294288 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f800603e-9119-44c9-9253-07fb97437cd7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4"
Apr 24 23:54:02.296889 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294308 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-os-release\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.296889 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294322 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-multus-socket-dir-parent\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.296889 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-etc-kubernetes\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.296889 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294368 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-etc-kubernetes\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.296889 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294430 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f800603e-9119-44c9-9253-07fb97437cd7-os-release\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4"
Apr 24 23:54:02.296889 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294576 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/64f94142-36c9-443f-988d-974f8671f7fe-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5h2dk\" (UID: \"64f94142-36c9-443f-988d-974f8671f7fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk"
Apr 24 23:54:02.296889 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294749 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-multus-daemon-config\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.296889 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294806 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ad8d234f-a974-4a38-8d63-b660058eeb43-host-slash\") pod \"iptables-alerter-qkrqf\" (UID: \"ad8d234f-a974-4a38-8d63-b660058eeb43\") " pod="openshift-network-operator/iptables-alerter-qkrqf"
Apr 24 23:54:02.296889 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294820 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f800603e-9119-44c9-9253-07fb97437cd7-cni-binary-copy\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4"
Apr 24 23:54:02.296889 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294859 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-etc-systemd\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z"
Apr 24 23:54:02.296889 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294861 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-hostroot\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.296889 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294898 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-host-run-netns\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll"
Apr 24 23:54:02.296889 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294953 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-node-log\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn"
Apr 24 23:54:02.296889 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:02.294979 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:54:02.297723 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.294988 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName:
\"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-host-cni-bin\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:02.297723 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.295032 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/64f94142-36c9-443f-988d-974f8671f7fe-etc-selinux\") pod \"aws-ebs-csi-driver-node-5h2dk\" (UID: \"64f94142-36c9-443f-988d-974f8671f7fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk" Apr 24 23:54:02.297723 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:02.295064 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs podName:6206bc2d-d85c-4007-8a04-e9eb243f590c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:02.795018004 +0000 UTC m=+3.076134923 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs") pod "network-metrics-daemon-fdw8f" (UID: "6206bc2d-d85c-4007-8a04-e9eb243f590c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:02.297723 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.295099 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-etc-sysconfig\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z" Apr 24 23:54:02.297723 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.295167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-log-socket\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:02.297723 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.295178 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/64f94142-36c9-443f-988d-974f8671f7fe-sys-fs\") pod \"aws-ebs-csi-driver-node-5h2dk\" (UID: \"64f94142-36c9-443f-988d-974f8671f7fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk" Apr 24 23:54:02.297723 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.295219 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-systemd-units\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:02.297723 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.295227 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-lib-modules\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z" Apr 24 23:54:02.297723 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.295267 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6633b011-7fd6-404a-b15b-b4d8f7c11aba-run\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z" Apr 24 23:54:02.297723 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.295275 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-host-run-multus-certs\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll" Apr 24 23:54:02.297723 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.295313 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f800603e-9119-44c9-9253-07fb97437cd7-cnibin\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4" Apr 24 23:54:02.297723 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.295406 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f800603e-9119-44c9-9253-07fb97437cd7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4" Apr 24 23:54:02.297723 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.295448 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/27ed6ad4-863b-4379-8e79-0244d71ad92d-host-run-ovn-kubernetes\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:02.297723 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.295478 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/27ed6ad4-863b-4379-8e79-0244d71ad92d-ovnkube-config\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:02.297723 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.295492 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-host-var-lib-cni-bin\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll" Apr 24 23:54:02.297723 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.295552 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-os-release\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll" Apr 24 23:54:02.297723 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.295570 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/68924e4d-1b30-4887-a8bc-c624385685df-tmp-dir\") pod \"node-resolver-58s9z\" (UID: \"68924e4d-1b30-4887-a8bc-c624385685df\") " pod="openshift-dns/node-resolver-58s9z" Apr 24 23:54:02.298588 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.295706 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" 
(UniqueName: \"kubernetes.io/configmap/f800603e-9119-44c9-9253-07fb97437cd7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4" Apr 24 23:54:02.298588 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.296961 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/27ed6ad4-863b-4379-8e79-0244d71ad92d-ovn-node-metrics-cert\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:02.298588 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.297120 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6633b011-7fd6-404a-b15b-b4d8f7c11aba-etc-tuned\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z" Apr 24 23:54:02.298588 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.297626 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6633b011-7fd6-404a-b15b-b4d8f7c11aba-tmp\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z" Apr 24 23:54:02.299536 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.299518 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/038b3357-ba5f-4aa6-8bda-d7a61161c9ce-agent-certs\") pod \"konnectivity-agent-48h4q\" (UID: \"038b3357-ba5f-4aa6-8bda-d7a61161c9ce\") " pod="kube-system/konnectivity-agent-48h4q" Apr 24 23:54:02.302979 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:02.302904 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:54:02.302979 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:02.302967 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:54:02.302979 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:02.302982 2576 projected.go:194] Error preparing data for projected volume kube-api-access-4h97d for pod openshift-network-diagnostics/network-check-target-mthk5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:02.302979 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:02.303043 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d podName:acf3640a-1870-4ea5-b4cb-f6e0d7abccf0 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:02.803026482 +0000 UTC m=+3.084143391 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4h97d" (UniqueName: "kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d") pod "network-check-target-mthk5" (UID: "acf3640a-1870-4ea5-b4cb-f6e0d7abccf0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:02.305168 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.304962 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl9pt\" (UniqueName: \"kubernetes.io/projected/ad8d234f-a974-4a38-8d63-b660058eeb43-kube-api-access-kl9pt\") pod \"iptables-alerter-qkrqf\" (UID: \"ad8d234f-a974-4a38-8d63-b660058eeb43\") " pod="openshift-network-operator/iptables-alerter-qkrqf" Apr 24 23:54:02.305811 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.305789 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj2kt\" (UniqueName: \"kubernetes.io/projected/27ed6ad4-863b-4379-8e79-0244d71ad92d-kube-api-access-kj2kt\") pod \"ovnkube-node-27ksn\" (UID: \"27ed6ad4-863b-4379-8e79-0244d71ad92d\") " pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:02.305934 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.305870 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p6wx\" (UniqueName: \"kubernetes.io/projected/2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f-kube-api-access-7p6wx\") pod \"multus-g9gll\" (UID: \"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f\") " pod="openshift-multus/multus-g9gll" Apr 24 23:54:02.307433 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.307404 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8h2s\" (UniqueName: \"kubernetes.io/projected/808b8c86-7996-4c7e-b677-dc648c7c5598-kube-api-access-n8h2s\") pod \"node-ca-6fc98\" (UID: \"808b8c86-7996-4c7e-b677-dc648c7c5598\") " 
pod="openshift-image-registry/node-ca-6fc98" Apr 24 23:54:02.307581 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.307524 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9pd6\" (UniqueName: \"kubernetes.io/projected/6206bc2d-d85c-4007-8a04-e9eb243f590c-kube-api-access-q9pd6\") pod \"network-metrics-daemon-fdw8f\" (UID: \"6206bc2d-d85c-4007-8a04-e9eb243f590c\") " pod="openshift-multus/network-metrics-daemon-fdw8f" Apr 24 23:54:02.308116 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.308096 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v959\" (UniqueName: \"kubernetes.io/projected/64f94142-36c9-443f-988d-974f8671f7fe-kube-api-access-7v959\") pod \"aws-ebs-csi-driver-node-5h2dk\" (UID: \"64f94142-36c9-443f-988d-974f8671f7fe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk" Apr 24 23:54:02.308320 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.308298 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fj4g\" (UniqueName: \"kubernetes.io/projected/f800603e-9119-44c9-9253-07fb97437cd7-kube-api-access-9fj4g\") pod \"multus-additional-cni-plugins-brnc4\" (UID: \"f800603e-9119-44c9-9253-07fb97437cd7\") " pod="openshift-multus/multus-additional-cni-plugins-brnc4" Apr 24 23:54:02.308709 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.308692 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsm7z\" (UniqueName: \"kubernetes.io/projected/6633b011-7fd6-404a-b15b-b4d8f7c11aba-kube-api-access-zsm7z\") pod \"tuned-6lz9z\" (UID: \"6633b011-7fd6-404a-b15b-b4d8f7c11aba\") " pod="openshift-cluster-node-tuning-operator/tuned-6lz9z" Apr 24 23:54:02.313502 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.313485 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfqlc\" (UniqueName: 
\"kubernetes.io/projected/68924e4d-1b30-4887-a8bc-c624385685df-kube-api-access-tfqlc\") pod \"node-resolver-58s9z\" (UID: \"68924e4d-1b30-4887-a8bc-c624385685df\") " pod="openshift-dns/node-resolver-58s9z" Apr 24 23:54:02.431060 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.431026 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:54:02.473329 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.473253 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-brnc4" Apr 24 23:54:02.482120 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.482095 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk" Apr 24 23:54:02.488849 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.488826 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6lz9z" Apr 24 23:54:02.495472 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.495450 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-g9gll" Apr 24 23:54:02.500537 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.500520 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-58s9z" Apr 24 23:54:02.508986 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.508969 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qkrqf" Apr 24 23:54:02.514482 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.514466 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-6fc98" Apr 24 23:54:02.521062 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.521044 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:02.529645 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.529625 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-48h4q" Apr 24 23:54:02.798333 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.798292 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs\") pod \"network-metrics-daemon-fdw8f\" (UID: \"6206bc2d-d85c-4007-8a04-e9eb243f590c\") " pod="openshift-multus/network-metrics-daemon-fdw8f" Apr 24 23:54:02.798501 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:02.798426 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:02.798501 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:02.798485 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs podName:6206bc2d-d85c-4007-8a04-e9eb243f590c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:03.798468842 +0000 UTC m=+4.079585746 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs") pod "network-metrics-daemon-fdw8f" (UID: "6206bc2d-d85c-4007-8a04-e9eb243f590c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:02.880105 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:02.880077 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cb3d0c7_68be_4f8d_b00f_77ab92cbb94f.slice/crio-f61be11a22b1f6f2da520ee19e53d44ceba7187bd1665a7eb82283237471e012 WatchSource:0}: Error finding container f61be11a22b1f6f2da520ee19e53d44ceba7187bd1665a7eb82283237471e012: Status 404 returned error can't find the container with id f61be11a22b1f6f2da520ee19e53d44ceba7187bd1665a7eb82283237471e012 Apr 24 23:54:02.881888 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:02.881742 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf800603e_9119_44c9_9253_07fb97437cd7.slice/crio-f5091e2288d1977713c3838a9182b4021bc882651239ccec65d66b8e1a7474a3 WatchSource:0}: Error finding container f5091e2288d1977713c3838a9182b4021bc882651239ccec65d66b8e1a7474a3: Status 404 returned error can't find the container with id f5091e2288d1977713c3838a9182b4021bc882651239ccec65d66b8e1a7474a3 Apr 24 23:54:02.885195 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:02.885174 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68924e4d_1b30_4887_a8bc_c624385685df.slice/crio-d1712ff3bb456fbada9733e2066fa40920e8dc684b9ef237cec19fdfdf6de3f9 WatchSource:0}: Error finding container d1712ff3bb456fbada9733e2066fa40920e8dc684b9ef237cec19fdfdf6de3f9: Status 404 returned error can't find the container with id d1712ff3bb456fbada9733e2066fa40920e8dc684b9ef237cec19fdfdf6de3f9 Apr 24 23:54:02.886487 
ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:02.886232 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod038b3357_ba5f_4aa6_8bda_d7a61161c9ce.slice/crio-aff4bea2a0a9ff6ef6057b237a01e37578d3ea320d3d4f5e84b166045ea247c8 WatchSource:0}: Error finding container aff4bea2a0a9ff6ef6057b237a01e37578d3ea320d3d4f5e84b166045ea247c8: Status 404 returned error can't find the container with id aff4bea2a0a9ff6ef6057b237a01e37578d3ea320d3d4f5e84b166045ea247c8 Apr 24 23:54:02.887025 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:02.886948 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6633b011_7fd6_404a_b15b_b4d8f7c11aba.slice/crio-94d2225e7d67f19636c85d1395d28bdcb693bca98e94aef72c0b7d16bb85ea54 WatchSource:0}: Error finding container 94d2225e7d67f19636c85d1395d28bdcb693bca98e94aef72c0b7d16bb85ea54: Status 404 returned error can't find the container with id 94d2225e7d67f19636c85d1395d28bdcb693bca98e94aef72c0b7d16bb85ea54 Apr 24 23:54:02.888756 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:02.888352 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad8d234f_a974_4a38_8d63_b660058eeb43.slice/crio-f339ee92a070c332de30464a3fb22de9b17abdcc3e30452db970d22beab816af WatchSource:0}: Error finding container f339ee92a070c332de30464a3fb22de9b17abdcc3e30452db970d22beab816af: Status 404 returned error can't find the container with id f339ee92a070c332de30464a3fb22de9b17abdcc3e30452db970d22beab816af Apr 24 23:54:02.888875 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:02.888853 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64f94142_36c9_443f_988d_974f8671f7fe.slice/crio-d6d7e727ad254e36817261c33f0c85e8ed4f83a15754ffc728424380e1d770d9 WatchSource:0}: 
Error finding container d6d7e727ad254e36817261c33f0c85e8ed4f83a15754ffc728424380e1d770d9: Status 404 returned error can't find the container with id d6d7e727ad254e36817261c33f0c85e8ed4f83a15754ffc728424380e1d770d9 Apr 24 23:54:02.899307 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:02.899287 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4h97d\" (UniqueName: \"kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d\") pod \"network-check-target-mthk5\" (UID: \"acf3640a-1870-4ea5-b4cb-f6e0d7abccf0\") " pod="openshift-network-diagnostics/network-check-target-mthk5" Apr 24 23:54:02.899435 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:02.899421 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:54:02.899501 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:02.899441 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:54:02.899501 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:02.899454 2576 projected.go:194] Error preparing data for projected volume kube-api-access-4h97d for pod openshift-network-diagnostics/network-check-target-mthk5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:02.899636 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:02.899509 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d podName:acf3640a-1870-4ea5-b4cb-f6e0d7abccf0 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:03.899490026 +0000 UTC m=+4.180606938 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4h97d" (UniqueName: "kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d") pod "network-check-target-mthk5" (UID: "acf3640a-1870-4ea5-b4cb-f6e0d7abccf0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:03.244151 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:03.244021 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 23:49:01 +0000 UTC" deadline="2028-02-05 23:08:35.617194196 +0000 UTC" Apr 24 23:54:03.244151 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:03.244080 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15647h14m32.3731179s" Apr 24 23:54:03.311249 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:03.311176 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mthk5" Apr 24 23:54:03.311416 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:03.311382 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mthk5" podUID="acf3640a-1870-4ea5-b4cb-f6e0d7abccf0"
Apr 24 23:54:03.328363 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:03.328329 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brnc4" event={"ID":"f800603e-9119-44c9-9253-07fb97437cd7","Type":"ContainerStarted","Data":"f5091e2288d1977713c3838a9182b4021bc882651239ccec65d66b8e1a7474a3"}
Apr 24 23:54:03.331534 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:03.331466 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" event={"ID":"27ed6ad4-863b-4379-8e79-0244d71ad92d","Type":"ContainerStarted","Data":"5cb4deb05d9075a309030d383d1495ca6ecaaee0e2b495ca20b65155c65652ed"}
Apr 24 23:54:03.334678 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:03.334654 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk" event={"ID":"64f94142-36c9-443f-988d-974f8671f7fe","Type":"ContainerStarted","Data":"d6d7e727ad254e36817261c33f0c85e8ed4f83a15754ffc728424380e1d770d9"}
Apr 24 23:54:03.342491 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:03.342460 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qkrqf" event={"ID":"ad8d234f-a974-4a38-8d63-b660058eeb43","Type":"ContainerStarted","Data":"f339ee92a070c332de30464a3fb22de9b17abdcc3e30452db970d22beab816af"}
Apr 24 23:54:03.347952 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:03.347910 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-58s9z" event={"ID":"68924e4d-1b30-4887-a8bc-c624385685df","Type":"ContainerStarted","Data":"d1712ff3bb456fbada9733e2066fa40920e8dc684b9ef237cec19fdfdf6de3f9"}
Apr 24 23:54:03.357908 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:03.357884 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g9gll" event={"ID":"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f","Type":"ContainerStarted","Data":"f61be11a22b1f6f2da520ee19e53d44ceba7187bd1665a7eb82283237471e012"}
Apr 24 23:54:03.383163 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:03.383132 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-109.ec2.internal" event={"ID":"d2e34ff8c6ad1ea6f4df1f49c9e13e6c","Type":"ContainerStarted","Data":"4175cde89de17403ea738011fbd7a7acc2c365794505e8f1d216618a420c7844"}
Apr 24 23:54:03.387200 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:03.387172 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6fc98" event={"ID":"808b8c86-7996-4c7e-b677-dc648c7c5598","Type":"ContainerStarted","Data":"c962aaf198a39131675b9dd70df79f086a112fa60b0c6bbe178a283fec3e40c2"}
Apr 24 23:54:03.390836 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:03.390810 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6lz9z" event={"ID":"6633b011-7fd6-404a-b15b-b4d8f7c11aba","Type":"ContainerStarted","Data":"94d2225e7d67f19636c85d1395d28bdcb693bca98e94aef72c0b7d16bb85ea54"}
Apr 24 23:54:03.393587 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:03.393546 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-48h4q" event={"ID":"038b3357-ba5f-4aa6-8bda-d7a61161c9ce","Type":"ContainerStarted","Data":"aff4bea2a0a9ff6ef6057b237a01e37578d3ea320d3d4f5e84b166045ea247c8"}
Apr 24 23:54:03.806707 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:03.806671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs\") pod \"network-metrics-daemon-fdw8f\" (UID: \"6206bc2d-d85c-4007-8a04-e9eb243f590c\") " pod="openshift-multus/network-metrics-daemon-fdw8f"
Apr 24 23:54:03.806880 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:03.806804 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:54:03.806880 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:03.806865 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs podName:6206bc2d-d85c-4007-8a04-e9eb243f590c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:05.80684627 +0000 UTC m=+6.087963174 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs") pod "network-metrics-daemon-fdw8f" (UID: "6206bc2d-d85c-4007-8a04-e9eb243f590c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:54:03.908387 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:03.908357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4h97d\" (UniqueName: \"kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d\") pod \"network-check-target-mthk5\" (UID: \"acf3640a-1870-4ea5-b4cb-f6e0d7abccf0\") " pod="openshift-network-diagnostics/network-check-target-mthk5"
Apr 24 23:54:03.908554 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:03.908536 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:54:03.908629 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:03.908561 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:54:03.908629 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:03.908574 2576 projected.go:194] Error preparing data for projected volume kube-api-access-4h97d for pod openshift-network-diagnostics/network-check-target-mthk5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:54:03.908629 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:03.908627 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d podName:acf3640a-1870-4ea5-b4cb-f6e0d7abccf0 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:05.908608764 +0000 UTC m=+6.189725663 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-4h97d" (UniqueName: "kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d") pod "network-check-target-mthk5" (UID: "acf3640a-1870-4ea5-b4cb-f6e0d7abccf0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:54:04.314469 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:04.314437 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdw8f"
Apr 24 23:54:04.314941 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:04.314576 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdw8f" podUID="6206bc2d-d85c-4007-8a04-e9eb243f590c"
Apr 24 23:54:04.410746 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:04.410710 2576 generic.go:358] "Generic (PLEG): container finished" podID="17afc3e90e99d1cc5da6f4eb47b9540d" containerID="213b287dce12fff19297e0a05d8c222098fa82bbbaf1a19c6c335cddd023b826" exitCode=0
Apr 24 23:54:04.411292 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:04.411262 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-109.ec2.internal" event={"ID":"17afc3e90e99d1cc5da6f4eb47b9540d","Type":"ContainerDied","Data":"213b287dce12fff19297e0a05d8c222098fa82bbbaf1a19c6c335cddd023b826"}
Apr 24 23:54:04.426763 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:04.426723 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-109.ec2.internal" podStartSLOduration=3.426711445 podStartE2EDuration="3.426711445s" podCreationTimestamp="2026-04-24 23:54:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:54:03.39906912 +0000 UTC m=+3.680186041" watchObservedRunningTime="2026-04-24 23:54:04.426711445 +0000 UTC m=+4.707828405"
Apr 24 23:54:05.312480 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:05.312003 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mthk5"
Apr 24 23:54:05.312480 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:05.312130 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mthk5" podUID="acf3640a-1870-4ea5-b4cb-f6e0d7abccf0"
Apr 24 23:54:05.422764 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:05.422087 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-109.ec2.internal" event={"ID":"17afc3e90e99d1cc5da6f4eb47b9540d","Type":"ContainerStarted","Data":"286832a9011d20a1c0fdbd520709fab57857eba4a39c39e7fe4c2e84927ea97a"}
Apr 24 23:54:05.438689 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:05.437876 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-109.ec2.internal" podStartSLOduration=4.437858984 podStartE2EDuration="4.437858984s" podCreationTimestamp="2026-04-24 23:54:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:54:05.436635347 +0000 UTC m=+5.717752267" watchObservedRunningTime="2026-04-24 23:54:05.437858984 +0000 UTC m=+5.718975907"
Apr 24 23:54:05.826765 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:05.826199 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs\") pod \"network-metrics-daemon-fdw8f\" (UID: \"6206bc2d-d85c-4007-8a04-e9eb243f590c\") " pod="openshift-multus/network-metrics-daemon-fdw8f"
Apr 24 23:54:05.826765 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:05.826357 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:54:05.826765 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:05.826422 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs podName:6206bc2d-d85c-4007-8a04-e9eb243f590c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:09.826402838 +0000 UTC m=+10.107519741 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs") pod "network-metrics-daemon-fdw8f" (UID: "6206bc2d-d85c-4007-8a04-e9eb243f590c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:54:05.927376 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:05.926652 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4h97d\" (UniqueName: \"kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d\") pod \"network-check-target-mthk5\" (UID: \"acf3640a-1870-4ea5-b4cb-f6e0d7abccf0\") " pod="openshift-network-diagnostics/network-check-target-mthk5"
Apr 24 23:54:05.927376 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:05.926866 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:54:05.927376 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:05.926887 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:54:05.927376 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:05.926902 2576 projected.go:194] Error preparing data for projected volume kube-api-access-4h97d for pod openshift-network-diagnostics/network-check-target-mthk5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:54:05.927376 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:05.926978 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d podName:acf3640a-1870-4ea5-b4cb-f6e0d7abccf0 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:09.926958611 +0000 UTC m=+10.208075514 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-4h97d" (UniqueName: "kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d") pod "network-check-target-mthk5" (UID: "acf3640a-1870-4ea5-b4cb-f6e0d7abccf0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:54:06.311485 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:06.311400 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdw8f"
Apr 24 23:54:06.311649 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:06.311550 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdw8f" podUID="6206bc2d-d85c-4007-8a04-e9eb243f590c"
Apr 24 23:54:07.311517 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:07.311490 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mthk5"
Apr 24 23:54:07.312022 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:07.311614 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mthk5" podUID="acf3640a-1870-4ea5-b4cb-f6e0d7abccf0"
Apr 24 23:54:08.312078 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:08.312034 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdw8f"
Apr 24 23:54:08.312614 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:08.312169 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdw8f" podUID="6206bc2d-d85c-4007-8a04-e9eb243f590c"
Apr 24 23:54:09.311936 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:09.311888 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mthk5"
Apr 24 23:54:09.312116 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:09.312032 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mthk5" podUID="acf3640a-1870-4ea5-b4cb-f6e0d7abccf0"
Apr 24 23:54:09.861526 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:09.861486 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs\") pod \"network-metrics-daemon-fdw8f\" (UID: \"6206bc2d-d85c-4007-8a04-e9eb243f590c\") " pod="openshift-multus/network-metrics-daemon-fdw8f"
Apr 24 23:54:09.861695 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:09.861664 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:54:09.861746 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:09.861732 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs podName:6206bc2d-d85c-4007-8a04-e9eb243f590c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:17.861709878 +0000 UTC m=+18.142826786 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs") pod "network-metrics-daemon-fdw8f" (UID: "6206bc2d-d85c-4007-8a04-e9eb243f590c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:54:09.963222 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:09.962619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4h97d\" (UniqueName: \"kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d\") pod \"network-check-target-mthk5\" (UID: \"acf3640a-1870-4ea5-b4cb-f6e0d7abccf0\") " pod="openshift-network-diagnostics/network-check-target-mthk5"
Apr 24 23:54:09.963222 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:09.962785 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:54:09.963222 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:09.962805 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:54:09.963222 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:09.962815 2576 projected.go:194] Error preparing data for projected volume kube-api-access-4h97d for pod openshift-network-diagnostics/network-check-target-mthk5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:54:09.963222 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:09.962868 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d podName:acf3640a-1870-4ea5-b4cb-f6e0d7abccf0 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:17.962849177 +0000 UTC m=+18.243966087 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-4h97d" (UniqueName: "kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d") pod "network-check-target-mthk5" (UID: "acf3640a-1870-4ea5-b4cb-f6e0d7abccf0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:54:10.313062 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:10.313021 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdw8f"
Apr 24 23:54:10.313523 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:10.313156 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdw8f" podUID="6206bc2d-d85c-4007-8a04-e9eb243f590c"
Apr 24 23:54:11.311481 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:11.311445 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mthk5"
Apr 24 23:54:11.311760 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:11.311576 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mthk5" podUID="acf3640a-1870-4ea5-b4cb-f6e0d7abccf0"
Apr 24 23:54:12.312033 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:12.312002 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdw8f"
Apr 24 23:54:12.312472 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:12.312132 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdw8f" podUID="6206bc2d-d85c-4007-8a04-e9eb243f590c"
Apr 24 23:54:13.311249 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:13.311220 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mthk5"
Apr 24 23:54:13.311417 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:13.311338 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mthk5" podUID="acf3640a-1870-4ea5-b4cb-f6e0d7abccf0"
Apr 24 23:54:14.312039 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:14.312008 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdw8f"
Apr 24 23:54:14.312490 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:14.312195 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdw8f" podUID="6206bc2d-d85c-4007-8a04-e9eb243f590c"
Apr 24 23:54:15.311177 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:15.311144 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mthk5"
Apr 24 23:54:15.311359 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:15.311266 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mthk5" podUID="acf3640a-1870-4ea5-b4cb-f6e0d7abccf0"
Apr 24 23:54:16.311711 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:16.311683 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdw8f"
Apr 24 23:54:16.312146 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:16.311823 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdw8f" podUID="6206bc2d-d85c-4007-8a04-e9eb243f590c"
Apr 24 23:54:17.312075 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:17.312034 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mthk5"
Apr 24 23:54:17.312518 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:17.312170 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mthk5" podUID="acf3640a-1870-4ea5-b4cb-f6e0d7abccf0"
Apr 24 23:54:17.922661 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:17.922626 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs\") pod \"network-metrics-daemon-fdw8f\" (UID: \"6206bc2d-d85c-4007-8a04-e9eb243f590c\") " pod="openshift-multus/network-metrics-daemon-fdw8f"
Apr 24 23:54:17.922827 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:17.922790 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:54:17.922885 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:17.922855 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs podName:6206bc2d-d85c-4007-8a04-e9eb243f590c nodeName:}" failed. No retries permitted until 2026-04-24 23:54:33.922837475 +0000 UTC m=+34.203954373 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs") pod "network-metrics-daemon-fdw8f" (UID: "6206bc2d-d85c-4007-8a04-e9eb243f590c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:54:18.023603 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:18.023560 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4h97d\" (UniqueName: \"kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d\") pod \"network-check-target-mthk5\" (UID: \"acf3640a-1870-4ea5-b4cb-f6e0d7abccf0\") " pod="openshift-network-diagnostics/network-check-target-mthk5"
Apr 24 23:54:18.023782 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:18.023707 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:54:18.023782 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:18.023733 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:54:18.023782 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:18.023746 2576 projected.go:194] Error preparing data for projected volume kube-api-access-4h97d for pod openshift-network-diagnostics/network-check-target-mthk5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:54:18.023960 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:18.023812 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d podName:acf3640a-1870-4ea5-b4cb-f6e0d7abccf0 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:34.023792788 +0000 UTC m=+34.304909704 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-4h97d" (UniqueName: "kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d") pod "network-check-target-mthk5" (UID: "acf3640a-1870-4ea5-b4cb-f6e0d7abccf0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:54:18.311841 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:18.311811 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdw8f"
Apr 24 23:54:18.312101 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:18.311980 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdw8f" podUID="6206bc2d-d85c-4007-8a04-e9eb243f590c"
Apr 24 23:54:19.311043 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:19.311011 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mthk5"
Apr 24 23:54:19.311212 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:19.311168 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mthk5" podUID="acf3640a-1870-4ea5-b4cb-f6e0d7abccf0"
Apr 24 23:54:20.312758 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:20.312601 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdw8f"
Apr 24 23:54:20.313375 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:20.312845 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdw8f" podUID="6206bc2d-d85c-4007-8a04-e9eb243f590c"
Apr 24 23:54:20.452577 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:20.452465 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log"
Apr 24 23:54:20.452960 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:20.452938 2576 generic.go:358] "Generic (PLEG): container finished" podID="27ed6ad4-863b-4379-8e79-0244d71ad92d" containerID="0c568977ffa3cd916699112bc4cd21cea41a782546f1dd84a94bdad56f122207" exitCode=1
Apr 24 23:54:20.452960 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:20.452946 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" event={"ID":"27ed6ad4-863b-4379-8e79-0244d71ad92d","Type":"ContainerStarted","Data":"0eea8d28ab60a4f4c6418eab9fcf47f826fb5413f46a537f2e67c5c37880c17a"}
Apr 24 23:54:20.453099 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:20.452975 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" event={"ID":"27ed6ad4-863b-4379-8e79-0244d71ad92d","Type":"ContainerDied","Data":"0c568977ffa3cd916699112bc4cd21cea41a782546f1dd84a94bdad56f122207"}
Apr 24 23:54:20.453099 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:20.452986 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" event={"ID":"27ed6ad4-863b-4379-8e79-0244d71ad92d","Type":"ContainerStarted","Data":"07d808feb565e1d2009f9b7a85a4739ff60175a864a1682e1ef3b8d591e2c0f6"}
Apr 24 23:54:20.455814 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:20.455795 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk" event={"ID":"64f94142-36c9-443f-988d-974f8671f7fe","Type":"ContainerStarted","Data":"fb886e71ae260e292388a56e174b87152174c9c0aabd3db61f0cadda4d9f4f3d"}
Apr 24 23:54:20.456839 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:20.456818 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-58s9z" event={"ID":"68924e4d-1b30-4887-a8bc-c624385685df","Type":"ContainerStarted","Data":"62331f7dcfa1de7250f6fdfbc9853516ca50216f1921be8703c65f273951fd71"}
Apr 24 23:54:20.458070 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:20.458051 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g9gll" event={"ID":"2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f","Type":"ContainerStarted","Data":"0dc156f3c952f447290ab138a040774f7c45931fadabae0c8136bd4c9b1d387f"}
Apr 24 23:54:20.459345 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:20.459323 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6fc98" event={"ID":"808b8c86-7996-4c7e-b677-dc648c7c5598","Type":"ContainerStarted","Data":"41877224f7a409f70b29cd44ec95f46e518cc8bad103a0272cc7279cbb3b8029"}
Apr 24 23:54:20.460515 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:20.460489 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6lz9z" event={"ID":"6633b011-7fd6-404a-b15b-b4d8f7c11aba","Type":"ContainerStarted","Data":"1f4bae92986f2621de30becc77aa3045569fac0b602f94d57ed3a3b89cc83994"}
Apr 24 23:54:20.461682 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:20.461664 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-48h4q" event={"ID":"038b3357-ba5f-4aa6-8bda-d7a61161c9ce","Type":"ContainerStarted","Data":"d98d17093dcf6287f628880747e9738882c35dcbb5b87d363f7734aad908c6fe"}
Apr 24 23:54:20.462878 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:20.462856 2576 generic.go:358] "Generic (PLEG): container finished" podID="f800603e-9119-44c9-9253-07fb97437cd7" containerID="21255cb588fae63fd626113fa44c433fa637f2d80bafca51840d50b0bd2fd92f" exitCode=0
Apr 24 23:54:20.462978 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:20.462889 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brnc4" event={"ID":"f800603e-9119-44c9-9253-07fb97437cd7","Type":"ContainerDied","Data":"21255cb588fae63fd626113fa44c433fa637f2d80bafca51840d50b0bd2fd92f"}
Apr 24 23:54:20.471460 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:20.471425 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-58s9z" podStartSLOduration=3.42420095 podStartE2EDuration="20.471416348s" podCreationTimestamp="2026-04-24 23:54:00 +0000 UTC" firstStartedPulling="2026-04-24 23:54:02.886692933 +0000 UTC m=+3.167809832" lastFinishedPulling="2026-04-24 23:54:19.933908325 +0000 UTC m=+20.215025230" observedRunningTime="2026-04-24 23:54:20.471178029 +0000 UTC m=+20.752294949" watchObservedRunningTime="2026-04-24 23:54:20.471416348 +0000 UTC m=+20.752533268"
Apr 24 23:54:20.485336 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:20.485305 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-48h4q" podStartSLOduration=11.391203916 podStartE2EDuration="20.48529461s" podCreationTimestamp="2026-04-24 23:54:00 +0000 UTC" firstStartedPulling="2026-04-24 23:54:02.888511018 +0000 UTC m=+3.169627916" lastFinishedPulling="2026-04-24 23:54:11.982601699 +0000 UTC m=+12.263718610" observedRunningTime="2026-04-24 23:54:20.484565348 +0000 UTC m=+20.765682268" watchObservedRunningTime="2026-04-24 23:54:20.48529461 +0000 UTC m=+20.766411530"
Apr 24 23:54:20.518212 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:20.518170 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-6lz9z" podStartSLOduration=3.46716906 podStartE2EDuration="20.518154685s" podCreationTimestamp="2026-04-24 23:54:00 +0000 UTC" firstStartedPulling="2026-04-24 23:54:02.889745318 +0000 UTC m=+3.170862215" lastFinishedPulling="2026-04-24 23:54:19.940730929 +0000 UTC m=+20.221847840" observedRunningTime="2026-04-24 23:54:20.517579571 +0000 UTC m=+20.798696497" watchObservedRunningTime="2026-04-24 23:54:20.518154685 +0000 UTC m=+20.799271658"
Apr 24 23:54:20.530871 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:20.530831 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-6fc98" podStartSLOduration=3.485625228 podStartE2EDuration="20.530818429s" podCreationTimestamp="2026-04-24 23:54:00 +0000 UTC" firstStartedPulling="2026-04-24 23:54:02.893139371 +0000 UTC m=+3.174256283" lastFinishedPulling="2026-04-24 23:54:19.938332583 +0000 UTC m=+20.219449484" observedRunningTime="2026-04-24 23:54:20.530659317 +0000 UTC m=+20.811776231" watchObservedRunningTime="2026-04-24 23:54:20.530818429 +0000 UTC m=+20.811935348"
Apr 24 23:54:20.552147 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:20.552106 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-g9gll" podStartSLOduration=3.443858376 podStartE2EDuration="20.552092155s" podCreationTimestamp="2026-04-24 23:54:00 +0000 UTC"
firstStartedPulling="2026-04-24 23:54:02.882248156 +0000 UTC m=+3.163365068" lastFinishedPulling="2026-04-24 23:54:19.990481935 +0000 UTC m=+20.271598847" observedRunningTime="2026-04-24 23:54:20.552052312 +0000 UTC m=+20.833169232" watchObservedRunningTime="2026-04-24 23:54:20.552092155 +0000 UTC m=+20.833209075" Apr 24 23:54:21.311962 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:21.311934 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mthk5" Apr 24 23:54:21.312099 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:21.312039 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mthk5" podUID="acf3640a-1870-4ea5-b4cb-f6e0d7abccf0" Apr 24 23:54:21.467738 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:21.467708 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 24 23:54:21.468279 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:21.468082 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" event={"ID":"27ed6ad4-863b-4379-8e79-0244d71ad92d","Type":"ContainerStarted","Data":"5cf621aa649f5cdabc07faa8927b4f3b4a18c34756f4e877f6b61285fb78addc"} Apr 24 23:54:21.468279 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:21.468120 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" event={"ID":"27ed6ad4-863b-4379-8e79-0244d71ad92d","Type":"ContainerStarted","Data":"c66c2f9033dc11a94022aa56b1a898e41420e4de3210d96f8ad768013989038a"} Apr 24 23:54:21.468279 ip-10-0-129-109 
kubenswrapper[2576]: I0424 23:54:21.468134 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" event={"ID":"27ed6ad4-863b-4379-8e79-0244d71ad92d","Type":"ContainerStarted","Data":"f34af6888dafdbbeb836cfb3edf82f5fd36cc43181d7924e72b3d0578fc29ff5"} Apr 24 23:54:21.469766 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:21.469740 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qkrqf" event={"ID":"ad8d234f-a974-4a38-8d63-b660058eeb43","Type":"ContainerStarted","Data":"981e8bc8e8dd161bb77b6fa1bc8e9477daa6fd4d6400c02afade8281b3d63249"} Apr 24 23:54:21.671711 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:21.671672 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 23:54:22.260093 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:22.259397 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T23:54:21.671698362Z","UUID":"72ad3eed-b213-4e23-9231-9b6170f87c38","Handler":null,"Name":"","Endpoint":""} Apr 24 23:54:22.263594 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:22.263568 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 23:54:22.263705 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:22.263603 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 23:54:22.312073 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:22.312043 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdw8f" Apr 24 23:54:22.312227 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:22.312189 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdw8f" podUID="6206bc2d-d85c-4007-8a04-e9eb243f590c" Apr 24 23:54:22.473856 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:22.473820 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk" event={"ID":"64f94142-36c9-443f-988d-974f8671f7fe","Type":"ContainerStarted","Data":"e347f49ab600be9e6cd3af0a535e22e4bd443507a276776911ec21e135fcfd89"} Apr 24 23:54:23.178407 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:23.178141 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-48h4q" Apr 24 23:54:23.178960 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:23.178937 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-48h4q" Apr 24 23:54:23.194601 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:23.194555 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qkrqf" podStartSLOduration=6.150876897 podStartE2EDuration="23.194542547s" podCreationTimestamp="2026-04-24 23:54:00 +0000 UTC" firstStartedPulling="2026-04-24 23:54:02.889999437 +0000 UTC m=+3.171116334" lastFinishedPulling="2026-04-24 23:54:19.933665074 +0000 UTC m=+20.214781984" observedRunningTime="2026-04-24 23:54:21.485145686 +0000 UTC m=+21.766262607" watchObservedRunningTime="2026-04-24 23:54:23.194542547 +0000 UTC m=+23.475659494" Apr 24 23:54:23.311384 ip-10-0-129-109 
kubenswrapper[2576]: I0424 23:54:23.311357 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mthk5" Apr 24 23:54:23.311534 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:23.311462 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mthk5" podUID="acf3640a-1870-4ea5-b4cb-f6e0d7abccf0" Apr 24 23:54:23.475845 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:23.475748 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-48h4q" Apr 24 23:54:23.476292 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:23.476258 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-48h4q" Apr 24 23:54:24.311377 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:24.311344 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdw8f" Apr 24 23:54:24.311577 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:24.311477 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdw8f" podUID="6206bc2d-d85c-4007-8a04-e9eb243f590c" Apr 24 23:54:25.311819 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:25.311784 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mthk5" Apr 24 23:54:25.312395 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:25.311873 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mthk5" podUID="acf3640a-1870-4ea5-b4cb-f6e0d7abccf0" Apr 24 23:54:25.481235 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:25.481199 2576 generic.go:358] "Generic (PLEG): container finished" podID="f800603e-9119-44c9-9253-07fb97437cd7" containerID="741e579b7cbcd82fd7cf751bf280cccecfa05296e88fbd80cf43bfb74a18a904" exitCode=0 Apr 24 23:54:25.481381 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:25.481283 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brnc4" event={"ID":"f800603e-9119-44c9-9253-07fb97437cd7","Type":"ContainerDied","Data":"741e579b7cbcd82fd7cf751bf280cccecfa05296e88fbd80cf43bfb74a18a904"} Apr 24 23:54:25.484110 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:25.484021 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 24 23:54:25.484378 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:25.484359 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" event={"ID":"27ed6ad4-863b-4379-8e79-0244d71ad92d","Type":"ContainerStarted","Data":"98fe06ca581569f496c34efdeb2162d97f9aff5f9e2e2b543181182fa610192d"} Apr 24 23:54:25.486036 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:25.486017 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk" 
event={"ID":"64f94142-36c9-443f-988d-974f8671f7fe","Type":"ContainerStarted","Data":"b23375bdbe94edca2d1e5683bcd3fd4de0fe33879c3eaadd699cd65b95562884"} Apr 24 23:54:26.312138 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:26.312115 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdw8f" Apr 24 23:54:26.312430 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:26.312211 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdw8f" podUID="6206bc2d-d85c-4007-8a04-e9eb243f590c" Apr 24 23:54:26.489953 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:26.489854 2576 generic.go:358] "Generic (PLEG): container finished" podID="f800603e-9119-44c9-9253-07fb97437cd7" containerID="35af58b415b30be81db0493c00e44e9a0727c31ef05539d1410a699e056ce5d2" exitCode=0 Apr 24 23:54:26.490088 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:26.489946 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brnc4" event={"ID":"f800603e-9119-44c9-9253-07fb97437cd7","Type":"ContainerDied","Data":"35af58b415b30be81db0493c00e44e9a0727c31ef05539d1410a699e056ce5d2"} Apr 24 23:54:26.511245 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:26.511204 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5h2dk" podStartSLOduration=4.963117935 podStartE2EDuration="26.511191106s" podCreationTimestamp="2026-04-24 23:54:00 +0000 UTC" firstStartedPulling="2026-04-24 23:54:02.892496126 +0000 UTC m=+3.173613030" lastFinishedPulling="2026-04-24 23:54:24.440569303 +0000 UTC m=+24.721686201" observedRunningTime="2026-04-24 
23:54:25.531074565 +0000 UTC m=+25.812191485" watchObservedRunningTime="2026-04-24 23:54:26.511191106 +0000 UTC m=+26.792308036" Apr 24 23:54:27.312151 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:27.311995 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mthk5" Apr 24 23:54:27.312503 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:27.312238 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mthk5" podUID="acf3640a-1870-4ea5-b4cb-f6e0d7abccf0" Apr 24 23:54:27.494119 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:27.494039 2576 generic.go:358] "Generic (PLEG): container finished" podID="f800603e-9119-44c9-9253-07fb97437cd7" containerID="33e452266a85a6e9d56b4362aa1901742cced94b67767cf3db474022728efb03" exitCode=0 Apr 24 23:54:27.494235 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:27.494118 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brnc4" event={"ID":"f800603e-9119-44c9-9253-07fb97437cd7","Type":"ContainerDied","Data":"33e452266a85a6e9d56b4362aa1901742cced94b67767cf3db474022728efb03"} Apr 24 23:54:27.497227 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:27.497208 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 24 23:54:27.497544 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:27.497522 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" 
event={"ID":"27ed6ad4-863b-4379-8e79-0244d71ad92d","Type":"ContainerStarted","Data":"c6908b1f3e38a6a3a4c62468d30a48a73d8d7a0946324800920a9fb740638ebc"} Apr 24 23:54:27.497792 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:27.497778 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:27.497874 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:27.497799 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:27.498054 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:27.498041 2576 scope.go:117] "RemoveContainer" containerID="0c568977ffa3cd916699112bc4cd21cea41a782546f1dd84a94bdad56f122207" Apr 24 23:54:27.513877 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:27.513799 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:28.311868 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:28.311838 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdw8f" Apr 24 23:54:28.312057 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:28.312009 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fdw8f" podUID="6206bc2d-d85c-4007-8a04-e9eb243f590c" Apr 24 23:54:28.503670 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:28.503640 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 24 23:54:28.504099 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:28.503967 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" event={"ID":"27ed6ad4-863b-4379-8e79-0244d71ad92d","Type":"ContainerStarted","Data":"68fb271b33b4b34d4d604bf2a01e9fc5e1ca0f00110129167a9c2e43730637db"} Apr 24 23:54:28.504454 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:28.504434 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:28.522388 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:28.522361 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:54:28.540637 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:28.540577 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" podStartSLOduration=11.408610932 podStartE2EDuration="28.540558375s" podCreationTimestamp="2026-04-24 23:54:00 +0000 UTC" firstStartedPulling="2026-04-24 23:54:02.893302657 +0000 UTC m=+3.174419555" lastFinishedPulling="2026-04-24 23:54:20.025250086 +0000 UTC m=+20.306366998" observedRunningTime="2026-04-24 23:54:28.537633472 +0000 UTC m=+28.818750392" watchObservedRunningTime="2026-04-24 23:54:28.540558375 +0000 UTC m=+28.821675296" Apr 24 23:54:28.541986 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:28.541409 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mthk5"] Apr 24 23:54:28.541986 ip-10-0-129-109 
kubenswrapper[2576]: I0424 23:54:28.541538 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mthk5" Apr 24 23:54:28.541986 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:28.541634 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mthk5" podUID="acf3640a-1870-4ea5-b4cb-f6e0d7abccf0" Apr 24 23:54:28.543931 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:28.543892 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fdw8f"] Apr 24 23:54:28.544142 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:28.544015 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdw8f" Apr 24 23:54:28.544142 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:28.544123 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdw8f" podUID="6206bc2d-d85c-4007-8a04-e9eb243f590c" Apr 24 23:54:30.312467 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:30.312430 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdw8f" Apr 24 23:54:30.312848 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:30.312552 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fdw8f" podUID="6206bc2d-d85c-4007-8a04-e9eb243f590c" Apr 24 23:54:30.312848 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:30.312614 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mthk5" Apr 24 23:54:30.312848 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:30.312697 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mthk5" podUID="acf3640a-1870-4ea5-b4cb-f6e0d7abccf0" Apr 24 23:54:32.311521 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:32.311488 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdw8f" Apr 24 23:54:32.312019 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:32.311630 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fdw8f" podUID="6206bc2d-d85c-4007-8a04-e9eb243f590c" Apr 24 23:54:32.312019 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:32.311699 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mthk5" Apr 24 23:54:32.312019 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:32.311833 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mthk5" podUID="acf3640a-1870-4ea5-b4cb-f6e0d7abccf0" Apr 24 23:54:33.086144 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.086116 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-109.ec2.internal" event="NodeReady" Apr 24 23:54:33.086291 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.086243 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 23:54:33.138761 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.138735 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fpgn9"] Apr 24 23:54:33.152728 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.152710 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6zfbj"] Apr 24 23:54:33.152864 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.152853 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-fpgn9" Apr 24 23:54:33.155199 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.155179 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-drk7t\"" Apr 24 23:54:33.155302 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.155201 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 23:54:33.155358 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.155334 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 23:54:33.169881 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.169859 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fpgn9"] Apr 24 23:54:33.169881 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.169883 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6zfbj"] Apr 24 23:54:33.170017 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.170005 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6zfbj" Apr 24 23:54:33.172614 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.172594 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 23:54:33.172614 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.172606 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 23:54:33.172758 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.172629 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 23:54:33.172812 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.172756 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gqd2q\"" Apr 24 23:54:33.342997 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.342978 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/972771be-01b9-4da1-b895-914fde15bc88-config-volume\") pod \"dns-default-fpgn9\" (UID: \"972771be-01b9-4da1-b895-914fde15bc88\") " pod="openshift-dns/dns-default-fpgn9" Apr 24 23:54:33.343301 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.343014 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q794t\" (UniqueName: \"kubernetes.io/projected/972771be-01b9-4da1-b895-914fde15bc88-kube-api-access-q794t\") pod \"dns-default-fpgn9\" (UID: \"972771be-01b9-4da1-b895-914fde15bc88\") " pod="openshift-dns/dns-default-fpgn9" Apr 24 23:54:33.343301 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.343044 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/972771be-01b9-4da1-b895-914fde15bc88-tmp-dir\") pod \"dns-default-fpgn9\" (UID: \"972771be-01b9-4da1-b895-914fde15bc88\") " pod="openshift-dns/dns-default-fpgn9" Apr 24 23:54:33.343301 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.343066 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert\") pod \"ingress-canary-6zfbj\" (UID: \"73aaed41-fe6b-4446-8ab2-95e11e051d4b\") " pod="openshift-ingress-canary/ingress-canary-6zfbj" Apr 24 23:54:33.343301 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.343124 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-657rd\" (UniqueName: \"kubernetes.io/projected/73aaed41-fe6b-4446-8ab2-95e11e051d4b-kube-api-access-657rd\") pod \"ingress-canary-6zfbj\" (UID: \"73aaed41-fe6b-4446-8ab2-95e11e051d4b\") " pod="openshift-ingress-canary/ingress-canary-6zfbj" Apr 24 23:54:33.343301 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.343151 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls\") pod \"dns-default-fpgn9\" (UID: \"972771be-01b9-4da1-b895-914fde15bc88\") " pod="openshift-dns/dns-default-fpgn9" Apr 24 23:54:33.444121 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.444098 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/972771be-01b9-4da1-b895-914fde15bc88-config-volume\") pod \"dns-default-fpgn9\" (UID: \"972771be-01b9-4da1-b895-914fde15bc88\") " pod="openshift-dns/dns-default-fpgn9" Apr 24 23:54:33.444230 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.444134 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-q794t\" (UniqueName: \"kubernetes.io/projected/972771be-01b9-4da1-b895-914fde15bc88-kube-api-access-q794t\") pod \"dns-default-fpgn9\" (UID: \"972771be-01b9-4da1-b895-914fde15bc88\") " pod="openshift-dns/dns-default-fpgn9" Apr 24 23:54:33.444230 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.444162 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/972771be-01b9-4da1-b895-914fde15bc88-tmp-dir\") pod \"dns-default-fpgn9\" (UID: \"972771be-01b9-4da1-b895-914fde15bc88\") " pod="openshift-dns/dns-default-fpgn9" Apr 24 23:54:33.444230 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.444178 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert\") pod \"ingress-canary-6zfbj\" (UID: \"73aaed41-fe6b-4446-8ab2-95e11e051d4b\") " pod="openshift-ingress-canary/ingress-canary-6zfbj" Apr 24 23:54:33.444230 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.444199 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-657rd\" (UniqueName: \"kubernetes.io/projected/73aaed41-fe6b-4446-8ab2-95e11e051d4b-kube-api-access-657rd\") pod \"ingress-canary-6zfbj\" (UID: \"73aaed41-fe6b-4446-8ab2-95e11e051d4b\") " pod="openshift-ingress-canary/ingress-canary-6zfbj" Apr 24 23:54:33.444230 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.444218 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls\") pod \"dns-default-fpgn9\" (UID: \"972771be-01b9-4da1-b895-914fde15bc88\") " pod="openshift-dns/dns-default-fpgn9" Apr 24 23:54:33.444458 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:33.444310 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" 
not found Apr 24 23:54:33.444458 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:33.444318 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:33.444458 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:33.444370 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls podName:972771be-01b9-4da1-b895-914fde15bc88 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:33.94435176 +0000 UTC m=+34.225468664 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls") pod "dns-default-fpgn9" (UID: "972771be-01b9-4da1-b895-914fde15bc88") : secret "dns-default-metrics-tls" not found Apr 24 23:54:33.444458 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:33.444387 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert podName:73aaed41-fe6b-4446-8ab2-95e11e051d4b nodeName:}" failed. No retries permitted until 2026-04-24 23:54:33.944378813 +0000 UTC m=+34.225495711 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert") pod "ingress-canary-6zfbj" (UID: "73aaed41-fe6b-4446-8ab2-95e11e051d4b") : secret "canary-serving-cert" not found Apr 24 23:54:33.444723 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.444504 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/972771be-01b9-4da1-b895-914fde15bc88-tmp-dir\") pod \"dns-default-fpgn9\" (UID: \"972771be-01b9-4da1-b895-914fde15bc88\") " pod="openshift-dns/dns-default-fpgn9" Apr 24 23:54:33.444805 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.444783 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/972771be-01b9-4da1-b895-914fde15bc88-config-volume\") pod \"dns-default-fpgn9\" (UID: \"972771be-01b9-4da1-b895-914fde15bc88\") " pod="openshift-dns/dns-default-fpgn9" Apr 24 23:54:33.455150 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.455126 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q794t\" (UniqueName: \"kubernetes.io/projected/972771be-01b9-4da1-b895-914fde15bc88-kube-api-access-q794t\") pod \"dns-default-fpgn9\" (UID: \"972771be-01b9-4da1-b895-914fde15bc88\") " pod="openshift-dns/dns-default-fpgn9" Apr 24 23:54:33.455235 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.455152 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-657rd\" (UniqueName: \"kubernetes.io/projected/73aaed41-fe6b-4446-8ab2-95e11e051d4b-kube-api-access-657rd\") pod \"ingress-canary-6zfbj\" (UID: \"73aaed41-fe6b-4446-8ab2-95e11e051d4b\") " pod="openshift-ingress-canary/ingress-canary-6zfbj" Apr 24 23:54:33.517375 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.517343 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brnc4" 
event={"ID":"f800603e-9119-44c9-9253-07fb97437cd7","Type":"ContainerStarted","Data":"7a24f17293d685b22760aec6b379d6a8f42ac07edd3173c91cb868653f844c62"} Apr 24 23:54:33.947632 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.947555 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls\") pod \"dns-default-fpgn9\" (UID: \"972771be-01b9-4da1-b895-914fde15bc88\") " pod="openshift-dns/dns-default-fpgn9" Apr 24 23:54:33.947632 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.947596 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs\") pod \"network-metrics-daemon-fdw8f\" (UID: \"6206bc2d-d85c-4007-8a04-e9eb243f590c\") " pod="openshift-multus/network-metrics-daemon-fdw8f" Apr 24 23:54:33.947943 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:33.947638 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert\") pod \"ingress-canary-6zfbj\" (UID: \"73aaed41-fe6b-4446-8ab2-95e11e051d4b\") " pod="openshift-ingress-canary/ingress-canary-6zfbj" Apr 24 23:54:33.947943 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:33.947713 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:33.947943 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:33.947724 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:33.947943 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:33.947762 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:33.947943 
ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:33.947765 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert podName:73aaed41-fe6b-4446-8ab2-95e11e051d4b nodeName:}" failed. No retries permitted until 2026-04-24 23:54:34.947751009 +0000 UTC m=+35.228867908 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert") pod "ingress-canary-6zfbj" (UID: "73aaed41-fe6b-4446-8ab2-95e11e051d4b") : secret "canary-serving-cert" not found Apr 24 23:54:33.947943 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:33.947825 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls podName:972771be-01b9-4da1-b895-914fde15bc88 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:34.947812525 +0000 UTC m=+35.228929422 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls") pod "dns-default-fpgn9" (UID: "972771be-01b9-4da1-b895-914fde15bc88") : secret "dns-default-metrics-tls" not found Apr 24 23:54:33.947943 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:33.947837 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs podName:6206bc2d-d85c-4007-8a04-e9eb243f590c nodeName:}" failed. No retries permitted until 2026-04-24 23:55:05.947829987 +0000 UTC m=+66.228946884 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs") pod "network-metrics-daemon-fdw8f" (UID: "6206bc2d-d85c-4007-8a04-e9eb243f590c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:34.048425 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:34.048390 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4h97d\" (UniqueName: \"kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d\") pod \"network-check-target-mthk5\" (UID: \"acf3640a-1870-4ea5-b4cb-f6e0d7abccf0\") " pod="openshift-network-diagnostics/network-check-target-mthk5" Apr 24 23:54:34.048570 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:34.048493 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:54:34.048570 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:34.048507 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:54:34.048570 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:34.048516 2576 projected.go:194] Error preparing data for projected volume kube-api-access-4h97d for pod openshift-network-diagnostics/network-check-target-mthk5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:34.048570 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:34.048560 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d podName:acf3640a-1870-4ea5-b4cb-f6e0d7abccf0 nodeName:}" failed. 
No retries permitted until 2026-04-24 23:55:06.048547426 +0000 UTC m=+66.329664324 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-4h97d" (UniqueName: "kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d") pod "network-check-target-mthk5" (UID: "acf3640a-1870-4ea5-b4cb-f6e0d7abccf0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:34.311235 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:34.311201 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mthk5" Apr 24 23:54:34.311235 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:34.311232 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdw8f" Apr 24 23:54:34.314844 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:34.314824 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 23:54:34.314991 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:34.314857 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qtxb2\"" Apr 24 23:54:34.314991 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:34.314861 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 23:54:34.314991 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:34.314863 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tjkj6\"" Apr 24 23:54:34.314991 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:34.314858 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 23:54:34.522166 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:34.522128 2576 generic.go:358] "Generic (PLEG): container finished" podID="f800603e-9119-44c9-9253-07fb97437cd7" containerID="7a24f17293d685b22760aec6b379d6a8f42ac07edd3173c91cb868653f844c62" exitCode=0 Apr 24 23:54:34.522166 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:34.522166 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brnc4" event={"ID":"f800603e-9119-44c9-9253-07fb97437cd7","Type":"ContainerDied","Data":"7a24f17293d685b22760aec6b379d6a8f42ac07edd3173c91cb868653f844c62"} Apr 24 23:54:34.954841 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:34.954754 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls\") pod \"dns-default-fpgn9\" (UID: \"972771be-01b9-4da1-b895-914fde15bc88\") " pod="openshift-dns/dns-default-fpgn9" Apr 24 23:54:34.955013 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:34.954845 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert\") pod \"ingress-canary-6zfbj\" (UID: \"73aaed41-fe6b-4446-8ab2-95e11e051d4b\") " pod="openshift-ingress-canary/ingress-canary-6zfbj" Apr 24 23:54:34.955013 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:34.954897 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:34.955013 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:34.954953 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:34.955013 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:34.954975 2576 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls podName:972771be-01b9-4da1-b895-914fde15bc88 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:36.95495924 +0000 UTC m=+37.236076140 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls") pod "dns-default-fpgn9" (UID: "972771be-01b9-4da1-b895-914fde15bc88") : secret "dns-default-metrics-tls" not found Apr 24 23:54:34.955013 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:34.954994 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert podName:73aaed41-fe6b-4446-8ab2-95e11e051d4b nodeName:}" failed. No retries permitted until 2026-04-24 23:54:36.954982918 +0000 UTC m=+37.236099815 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert") pod "ingress-canary-6zfbj" (UID: "73aaed41-fe6b-4446-8ab2-95e11e051d4b") : secret "canary-serving-cert" not found Apr 24 23:54:35.526585 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:35.526552 2576 generic.go:358] "Generic (PLEG): container finished" podID="f800603e-9119-44c9-9253-07fb97437cd7" containerID="d85c1747e97f8bd38885fc564939bc742dc3c95e906436fa0cfe2fcfb84ab0d6" exitCode=0 Apr 24 23:54:35.526954 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:35.526600 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brnc4" event={"ID":"f800603e-9119-44c9-9253-07fb97437cd7","Type":"ContainerDied","Data":"d85c1747e97f8bd38885fc564939bc742dc3c95e906436fa0cfe2fcfb84ab0d6"} Apr 24 23:54:36.530895 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:36.530859 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brnc4" 
event={"ID":"f800603e-9119-44c9-9253-07fb97437cd7","Type":"ContainerStarted","Data":"9d0896cbfac2f8b71c182d43647e08b4fd5b35529252db6bccf6d11968870c02"} Apr 24 23:54:36.556227 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:36.556179 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-brnc4" podStartSLOduration=6.108880739 podStartE2EDuration="36.556161808s" podCreationTimestamp="2026-04-24 23:54:00 +0000 UTC" firstStartedPulling="2026-04-24 23:54:02.884212042 +0000 UTC m=+3.165328940" lastFinishedPulling="2026-04-24 23:54:33.331493108 +0000 UTC m=+33.612610009" observedRunningTime="2026-04-24 23:54:36.555046395 +0000 UTC m=+36.836163326" watchObservedRunningTime="2026-04-24 23:54:36.556161808 +0000 UTC m=+36.837278729" Apr 24 23:54:36.966591 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:36.966517 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls\") pod \"dns-default-fpgn9\" (UID: \"972771be-01b9-4da1-b895-914fde15bc88\") " pod="openshift-dns/dns-default-fpgn9" Apr 24 23:54:36.966591 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:36.966580 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert\") pod \"ingress-canary-6zfbj\" (UID: \"73aaed41-fe6b-4446-8ab2-95e11e051d4b\") " pod="openshift-ingress-canary/ingress-canary-6zfbj" Apr 24 23:54:36.966757 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:36.966661 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:36.966757 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:36.966669 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 
23:54:36.966757 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:36.966709 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert podName:73aaed41-fe6b-4446-8ab2-95e11e051d4b nodeName:}" failed. No retries permitted until 2026-04-24 23:54:40.966695256 +0000 UTC m=+41.247812153 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert") pod "ingress-canary-6zfbj" (UID: "73aaed41-fe6b-4446-8ab2-95e11e051d4b") : secret "canary-serving-cert" not found Apr 24 23:54:36.966757 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:36.966727 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls podName:972771be-01b9-4da1-b895-914fde15bc88 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:40.966713756 +0000 UTC m=+41.247830653 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls") pod "dns-default-fpgn9" (UID: "972771be-01b9-4da1-b895-914fde15bc88") : secret "dns-default-metrics-tls" not found Apr 24 23:54:40.995985 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:40.995783 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert\") pod \"ingress-canary-6zfbj\" (UID: \"73aaed41-fe6b-4446-8ab2-95e11e051d4b\") " pod="openshift-ingress-canary/ingress-canary-6zfbj" Apr 24 23:54:40.996418 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:40.996007 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls\") pod \"dns-default-fpgn9\" (UID: \"972771be-01b9-4da1-b895-914fde15bc88\") " pod="openshift-dns/dns-default-fpgn9" Apr 24 23:54:40.996418 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:40.995940 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:40.996418 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:40.996101 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert podName:73aaed41-fe6b-4446-8ab2-95e11e051d4b nodeName:}" failed. No retries permitted until 2026-04-24 23:54:48.996085701 +0000 UTC m=+49.277202600 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert") pod "ingress-canary-6zfbj" (UID: "73aaed41-fe6b-4446-8ab2-95e11e051d4b") : secret "canary-serving-cert" not found Apr 24 23:54:40.996418 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:40.996151 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:40.996418 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:40.996207 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls podName:972771be-01b9-4da1-b895-914fde15bc88 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:48.996189872 +0000 UTC m=+49.277306785 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls") pod "dns-default-fpgn9" (UID: "972771be-01b9-4da1-b895-914fde15bc88") : secret "dns-default-metrics-tls" not found Apr 24 23:54:44.013411 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.013376 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8597487dcc-9rfjn"] Apr 24 23:54:44.016151 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.016132 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8597487dcc-9rfjn" Apr 24 23:54:44.019460 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.019442 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 24 23:54:44.020119 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.020100 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 24 23:54:44.020278 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.020261 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-9s6bx\"" Apr 24 23:54:44.020483 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.020470 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 24 23:54:44.023470 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.023449 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 24 23:54:44.032076 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.032053 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8597487dcc-9rfjn"] Apr 24 23:54:44.042495 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.042471 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml"] Apr 24 23:54:44.059405 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.059381 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" Apr 24 23:54:44.062460 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.062284 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 24 23:54:44.062460 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.062412 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 24 23:54:44.062460 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.062415 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 24 23:54:44.062948 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.062927 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 24 23:54:44.075117 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.075093 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml"] Apr 24 23:54:44.116142 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.116119 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a8ae5a9e-22ac-4057-baaa-8f48082fd2d3-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5f44557f74-jfnml\" (UID: \"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" Apr 24 23:54:44.116281 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.116155 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: 
\"kubernetes.io/secret/a8ae5a9e-22ac-4057-baaa-8f48082fd2d3-hub\") pod \"cluster-proxy-proxy-agent-5f44557f74-jfnml\" (UID: \"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" Apr 24 23:54:44.116281 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.116180 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/a8ae5a9e-22ac-4057-baaa-8f48082fd2d3-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5f44557f74-jfnml\" (UID: \"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" Apr 24 23:54:44.116281 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.116206 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhkb9\" (UniqueName: \"kubernetes.io/projected/a8ae5a9e-22ac-4057-baaa-8f48082fd2d3-kube-api-access-qhkb9\") pod \"cluster-proxy-proxy-agent-5f44557f74-jfnml\" (UID: \"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" Apr 24 23:54:44.116281 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.116256 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/a8ae5a9e-22ac-4057-baaa-8f48082fd2d3-ca\") pod \"cluster-proxy-proxy-agent-5f44557f74-jfnml\" (UID: \"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" Apr 24 23:54:44.116414 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.116320 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4pvd\" (UniqueName: \"kubernetes.io/projected/b6aa6598-8e40-4911-b6d1-de3287532b48-kube-api-access-q4pvd\") pod 
\"managed-serviceaccount-addon-agent-8597487dcc-9rfjn\" (UID: \"b6aa6598-8e40-4911-b6d1-de3287532b48\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8597487dcc-9rfjn" Apr 24 23:54:44.116414 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.116342 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/a8ae5a9e-22ac-4057-baaa-8f48082fd2d3-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5f44557f74-jfnml\" (UID: \"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" Apr 24 23:54:44.116414 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.116400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b6aa6598-8e40-4911-b6d1-de3287532b48-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8597487dcc-9rfjn\" (UID: \"b6aa6598-8e40-4911-b6d1-de3287532b48\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8597487dcc-9rfjn" Apr 24 23:54:44.217511 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.217482 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a8ae5a9e-22ac-4057-baaa-8f48082fd2d3-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5f44557f74-jfnml\" (UID: \"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" Apr 24 23:54:44.217511 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.217514 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/a8ae5a9e-22ac-4057-baaa-8f48082fd2d3-hub\") pod \"cluster-proxy-proxy-agent-5f44557f74-jfnml\" (UID: 
\"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" Apr 24 23:54:44.217683 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.217530 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/a8ae5a9e-22ac-4057-baaa-8f48082fd2d3-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5f44557f74-jfnml\" (UID: \"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" Apr 24 23:54:44.217683 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.217548 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhkb9\" (UniqueName: \"kubernetes.io/projected/a8ae5a9e-22ac-4057-baaa-8f48082fd2d3-kube-api-access-qhkb9\") pod \"cluster-proxy-proxy-agent-5f44557f74-jfnml\" (UID: \"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" Apr 24 23:54:44.217683 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.217574 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/a8ae5a9e-22ac-4057-baaa-8f48082fd2d3-ca\") pod \"cluster-proxy-proxy-agent-5f44557f74-jfnml\" (UID: \"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" Apr 24 23:54:44.217683 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.217599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4pvd\" (UniqueName: \"kubernetes.io/projected/b6aa6598-8e40-4911-b6d1-de3287532b48-kube-api-access-q4pvd\") pod \"managed-serviceaccount-addon-agent-8597487dcc-9rfjn\" (UID: \"b6aa6598-8e40-4911-b6d1-de3287532b48\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8597487dcc-9rfjn" Apr 
24 23:54:44.217844 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.217752 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/a8ae5a9e-22ac-4057-baaa-8f48082fd2d3-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5f44557f74-jfnml\" (UID: \"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" Apr 24 23:54:44.217896 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.217860 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b6aa6598-8e40-4911-b6d1-de3287532b48-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8597487dcc-9rfjn\" (UID: \"b6aa6598-8e40-4911-b6d1-de3287532b48\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8597487dcc-9rfjn" Apr 24 23:54:44.218455 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.218373 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/a8ae5a9e-22ac-4057-baaa-8f48082fd2d3-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5f44557f74-jfnml\" (UID: \"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" Apr 24 23:54:44.220666 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.220643 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/a8ae5a9e-22ac-4057-baaa-8f48082fd2d3-ca\") pod \"cluster-proxy-proxy-agent-5f44557f74-jfnml\" (UID: \"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" Apr 24 23:54:44.220840 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.220822 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" 
(UniqueName: \"kubernetes.io/secret/a8ae5a9e-22ac-4057-baaa-8f48082fd2d3-hub\") pod \"cluster-proxy-proxy-agent-5f44557f74-jfnml\" (UID: \"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" Apr 24 23:54:44.220986 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.220970 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a8ae5a9e-22ac-4057-baaa-8f48082fd2d3-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5f44557f74-jfnml\" (UID: \"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" Apr 24 23:54:44.221133 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.221112 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b6aa6598-8e40-4911-b6d1-de3287532b48-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8597487dcc-9rfjn\" (UID: \"b6aa6598-8e40-4911-b6d1-de3287532b48\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8597487dcc-9rfjn" Apr 24 23:54:44.221380 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.221363 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/a8ae5a9e-22ac-4057-baaa-8f48082fd2d3-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5f44557f74-jfnml\" (UID: \"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" Apr 24 23:54:44.225627 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.225605 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhkb9\" (UniqueName: \"kubernetes.io/projected/a8ae5a9e-22ac-4057-baaa-8f48082fd2d3-kube-api-access-qhkb9\") pod \"cluster-proxy-proxy-agent-5f44557f74-jfnml\" (UID: 
\"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" Apr 24 23:54:44.241910 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.241883 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4pvd\" (UniqueName: \"kubernetes.io/projected/b6aa6598-8e40-4911-b6d1-de3287532b48-kube-api-access-q4pvd\") pod \"managed-serviceaccount-addon-agent-8597487dcc-9rfjn\" (UID: \"b6aa6598-8e40-4911-b6d1-de3287532b48\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8597487dcc-9rfjn" Apr 24 23:54:44.337843 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.337826 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8597487dcc-9rfjn" Apr 24 23:54:44.367551 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.367526 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" Apr 24 23:54:44.490057 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.490034 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8597487dcc-9rfjn"] Apr 24 23:54:44.492546 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:44.492515 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6aa6598_8e40_4911_b6d1_de3287532b48.slice/crio-d7fc7a22f0a8ae4788b3a7695928770031e432c1e3a2b42afc6f3644aa99e7ed WatchSource:0}: Error finding container d7fc7a22f0a8ae4788b3a7695928770031e432c1e3a2b42afc6f3644aa99e7ed: Status 404 returned error can't find the container with id d7fc7a22f0a8ae4788b3a7695928770031e432c1e3a2b42afc6f3644aa99e7ed Apr 24 23:54:44.503028 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.503006 2576 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml"] Apr 24 23:54:44.506166 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:54:44.506141 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8ae5a9e_22ac_4057_baaa_8f48082fd2d3.slice/crio-c78ad1d2c75b2b5be7e0ca02d9d7655520ab8d26cecbbb706d49b678431ca2dc WatchSource:0}: Error finding container c78ad1d2c75b2b5be7e0ca02d9d7655520ab8d26cecbbb706d49b678431ca2dc: Status 404 returned error can't find the container with id c78ad1d2c75b2b5be7e0ca02d9d7655520ab8d26cecbbb706d49b678431ca2dc Apr 24 23:54:44.546080 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.546044 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" event={"ID":"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3","Type":"ContainerStarted","Data":"c78ad1d2c75b2b5be7e0ca02d9d7655520ab8d26cecbbb706d49b678431ca2dc"} Apr 24 23:54:44.546956 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:44.546900 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8597487dcc-9rfjn" event={"ID":"b6aa6598-8e40-4911-b6d1-de3287532b48","Type":"ContainerStarted","Data":"d7fc7a22f0a8ae4788b3a7695928770031e432c1e3a2b42afc6f3644aa99e7ed"} Apr 24 23:54:48.556636 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:48.556597 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" event={"ID":"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3","Type":"ContainerStarted","Data":"0c48c6d5cd5378077129901f1743730d34274d7048c63c0c93d65b31c6dd190c"} Apr 24 23:54:48.557824 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:48.557794 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8597487dcc-9rfjn" event={"ID":"b6aa6598-8e40-4911-b6d1-de3287532b48","Type":"ContainerStarted","Data":"7cc8661ad3dc78b524a155e8e5dc77d4c592411f52d6bb27ba57d48a976226a1"} Apr 24 23:54:48.573745 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:48.573693 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8597487dcc-9rfjn" podStartSLOduration=1.728586822 podStartE2EDuration="5.573678851s" podCreationTimestamp="2026-04-24 23:54:43 +0000 UTC" firstStartedPulling="2026-04-24 23:54:44.494289001 +0000 UTC m=+44.775405904" lastFinishedPulling="2026-04-24 23:54:48.33938102 +0000 UTC m=+48.620497933" observedRunningTime="2026-04-24 23:54:48.57302097 +0000 UTC m=+48.854137887" watchObservedRunningTime="2026-04-24 23:54:48.573678851 +0000 UTC m=+48.854795770" Apr 24 23:54:49.055307 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:49.055270 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert\") pod \"ingress-canary-6zfbj\" (UID: \"73aaed41-fe6b-4446-8ab2-95e11e051d4b\") " pod="openshift-ingress-canary/ingress-canary-6zfbj" Apr 24 23:54:49.055493 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:49.055325 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls\") pod \"dns-default-fpgn9\" (UID: \"972771be-01b9-4da1-b895-914fde15bc88\") " pod="openshift-dns/dns-default-fpgn9" Apr 24 23:54:49.055493 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:49.055410 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:49.055493 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:49.055445 2576 
secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:49.055493 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:49.055469 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert podName:73aaed41-fe6b-4446-8ab2-95e11e051d4b nodeName:}" failed. No retries permitted until 2026-04-24 23:55:05.055453767 +0000 UTC m=+65.336570670 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert") pod "ingress-canary-6zfbj" (UID: "73aaed41-fe6b-4446-8ab2-95e11e051d4b") : secret "canary-serving-cert" not found Apr 24 23:54:49.055493 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:54:49.055494 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls podName:972771be-01b9-4da1-b895-914fde15bc88 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:05.055478938 +0000 UTC m=+65.336595850 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls") pod "dns-default-fpgn9" (UID: "972771be-01b9-4da1-b895-914fde15bc88") : secret "dns-default-metrics-tls" not found Apr 24 23:54:51.566224 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:51.566188 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" event={"ID":"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3","Type":"ContainerStarted","Data":"e0fd049b8d5b0ec2bda5aff05bcf836787d3453652149fe656396f2d73ab8249"} Apr 24 23:54:51.566224 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:51.566223 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" event={"ID":"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3","Type":"ContainerStarted","Data":"7652ebb191b61f679cbd692bd025ef3ed14c41f7792e68629c06758b7791354b"} Apr 24 23:54:51.587468 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:54:51.587429 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" podStartSLOduration=1.266850942 podStartE2EDuration="7.587416926s" podCreationTimestamp="2026-04-24 23:54:44 +0000 UTC" firstStartedPulling="2026-04-24 23:54:44.507755274 +0000 UTC m=+44.788872172" lastFinishedPulling="2026-04-24 23:54:50.828321258 +0000 UTC m=+51.109438156" observedRunningTime="2026-04-24 23:54:51.586131949 +0000 UTC m=+51.867248881" watchObservedRunningTime="2026-04-24 23:54:51.587416926 +0000 UTC m=+51.868533845" Apr 24 23:55:00.518593 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:55:00.518565 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-27ksn" Apr 24 23:55:05.068259 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:55:05.068226 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert\") pod \"ingress-canary-6zfbj\" (UID: \"73aaed41-fe6b-4446-8ab2-95e11e051d4b\") " pod="openshift-ingress-canary/ingress-canary-6zfbj" Apr 24 23:55:05.068634 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:55:05.068273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls\") pod \"dns-default-fpgn9\" (UID: \"972771be-01b9-4da1-b895-914fde15bc88\") " pod="openshift-dns/dns-default-fpgn9" Apr 24 23:55:05.068634 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:55:05.068359 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:55:05.068634 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:55:05.068374 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:55:05.068634 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:55:05.068418 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls podName:972771be-01b9-4da1-b895-914fde15bc88 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:37.068404448 +0000 UTC m=+97.349521346 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls") pod "dns-default-fpgn9" (UID: "972771be-01b9-4da1-b895-914fde15bc88") : secret "dns-default-metrics-tls" not found Apr 24 23:55:05.068634 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:55:05.068431 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert podName:73aaed41-fe6b-4446-8ab2-95e11e051d4b nodeName:}" failed. No retries permitted until 2026-04-24 23:55:37.068425856 +0000 UTC m=+97.349542753 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert") pod "ingress-canary-6zfbj" (UID: "73aaed41-fe6b-4446-8ab2-95e11e051d4b") : secret "canary-serving-cert" not found Apr 24 23:55:05.975935 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:55:05.975874 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs\") pod \"network-metrics-daemon-fdw8f\" (UID: \"6206bc2d-d85c-4007-8a04-e9eb243f590c\") " pod="openshift-multus/network-metrics-daemon-fdw8f" Apr 24 23:55:05.978186 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:55:05.978168 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 23:55:05.986096 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:55:05.986074 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 23:55:05.986163 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:55:05.986153 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs podName:6206bc2d-d85c-4007-8a04-e9eb243f590c nodeName:}" failed. 
No retries permitted until 2026-04-24 23:56:09.986132141 +0000 UTC m=+130.267249054 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs") pod "network-metrics-daemon-fdw8f" (UID: "6206bc2d-d85c-4007-8a04-e9eb243f590c") : secret "metrics-daemon-secret" not found Apr 24 23:55:06.076609 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:55:06.076581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4h97d\" (UniqueName: \"kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d\") pod \"network-check-target-mthk5\" (UID: \"acf3640a-1870-4ea5-b4cb-f6e0d7abccf0\") " pod="openshift-network-diagnostics/network-check-target-mthk5" Apr 24 23:55:06.078891 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:55:06.078873 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 23:55:06.089607 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:55:06.089590 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 23:55:06.100230 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:55:06.100210 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h97d\" (UniqueName: \"kubernetes.io/projected/acf3640a-1870-4ea5-b4cb-f6e0d7abccf0-kube-api-access-4h97d\") pod \"network-check-target-mthk5\" (UID: \"acf3640a-1870-4ea5-b4cb-f6e0d7abccf0\") " pod="openshift-network-diagnostics/network-check-target-mthk5" Apr 24 23:55:06.123898 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:55:06.123873 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tjkj6\"" Apr 24 23:55:06.132037 ip-10-0-129-109 kubenswrapper[2576]: I0424 
23:55:06.132017 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mthk5" Apr 24 23:55:06.241814 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:55:06.241752 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mthk5"] Apr 24 23:55:06.244945 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:55:06.244890 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacf3640a_1870_4ea5_b4cb_f6e0d7abccf0.slice/crio-0d5122465c4173eadfaefc0a5d1d367a65f42af8ae10ede9b360f26ad1d9c429 WatchSource:0}: Error finding container 0d5122465c4173eadfaefc0a5d1d367a65f42af8ae10ede9b360f26ad1d9c429: Status 404 returned error can't find the container with id 0d5122465c4173eadfaefc0a5d1d367a65f42af8ae10ede9b360f26ad1d9c429 Apr 24 23:55:06.591965 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:55:06.591931 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mthk5" event={"ID":"acf3640a-1870-4ea5-b4cb-f6e0d7abccf0","Type":"ContainerStarted","Data":"0d5122465c4173eadfaefc0a5d1d367a65f42af8ae10ede9b360f26ad1d9c429"} Apr 24 23:55:09.600180 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:55:09.600061 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mthk5" event={"ID":"acf3640a-1870-4ea5-b4cb-f6e0d7abccf0","Type":"ContainerStarted","Data":"8f19b5e3dda24f2b40b4a7a7292b04e1a25880de54081fe633827bd27c22143a"} Apr 24 23:55:09.600559 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:55:09.600185 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-mthk5" Apr 24 23:55:09.616426 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:55:09.616385 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-target-mthk5" podStartSLOduration=67.044290053 podStartE2EDuration="1m9.616373512s" podCreationTimestamp="2026-04-24 23:54:00 +0000 UTC" firstStartedPulling="2026-04-24 23:55:06.246727118 +0000 UTC m=+66.527844016" lastFinishedPulling="2026-04-24 23:55:08.818810577 +0000 UTC m=+69.099927475" observedRunningTime="2026-04-24 23:55:09.616163511 +0000 UTC m=+69.897280430" watchObservedRunningTime="2026-04-24 23:55:09.616373512 +0000 UTC m=+69.897490439" Apr 24 23:55:37.085854 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:55:37.085795 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls\") pod \"dns-default-fpgn9\" (UID: \"972771be-01b9-4da1-b895-914fde15bc88\") " pod="openshift-dns/dns-default-fpgn9" Apr 24 23:55:37.086487 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:55:37.085878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert\") pod \"ingress-canary-6zfbj\" (UID: \"73aaed41-fe6b-4446-8ab2-95e11e051d4b\") " pod="openshift-ingress-canary/ingress-canary-6zfbj" Apr 24 23:55:37.086487 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:55:37.085985 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:55:37.086487 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:55:37.085991 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:55:37.086487 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:55:37.086052 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert podName:73aaed41-fe6b-4446-8ab2-95e11e051d4b nodeName:}" failed. 
No retries permitted until 2026-04-24 23:56:41.086034162 +0000 UTC m=+161.367151073 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert") pod "ingress-canary-6zfbj" (UID: "73aaed41-fe6b-4446-8ab2-95e11e051d4b") : secret "canary-serving-cert" not found Apr 24 23:55:37.086487 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:55:37.086067 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls podName:972771be-01b9-4da1-b895-914fde15bc88 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:41.08605993 +0000 UTC m=+161.367176828 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls") pod "dns-default-fpgn9" (UID: "972771be-01b9-4da1-b895-914fde15bc88") : secret "dns-default-metrics-tls" not found Apr 24 23:55:40.605522 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:55:40.605487 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-mthk5" Apr 24 23:56:05.091540 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:05.091487 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-58s9z_68924e4d-1b30-4887-a8bc-c624385685df/dns-node-resolver/0.log" Apr 24 23:56:05.890407 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:05.890382 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-6fc98_808b8c86-7996-4c7e-b677-dc648c7c5598/node-ca/0.log" Apr 24 23:56:10.005852 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:10.005818 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs\") pod 
\"network-metrics-daemon-fdw8f\" (UID: \"6206bc2d-d85c-4007-8a04-e9eb243f590c\") " pod="openshift-multus/network-metrics-daemon-fdw8f" Apr 24 23:56:10.006276 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:10.005994 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 23:56:10.006276 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:10.006077 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs podName:6206bc2d-d85c-4007-8a04-e9eb243f590c nodeName:}" failed. No retries permitted until 2026-04-24 23:58:12.006059328 +0000 UTC m=+252.287176226 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs") pod "network-metrics-daemon-fdw8f" (UID: "6206bc2d-d85c-4007-8a04-e9eb243f590c") : secret "metrics-daemon-secret" not found Apr 24 23:56:14.523650 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.523611 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-c4577ddb7-8svjg"] Apr 24 23:56:14.526452 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.526434 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:14.528895 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.528878 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 23:56:14.529123 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.529110 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 23:56:14.529271 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.529256 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 23:56:14.529389 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.529374 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tpz6k\"" Apr 24 23:56:14.533707 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.533689 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 23:56:14.538402 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.538372 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-c4577ddb7-8svjg"] Apr 24 23:56:14.639939 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.639846 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtc27\" (UniqueName: \"kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-kube-api-access-rtc27\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:14.640133 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.639957 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4996022-62e6-4097-9973-46052375a1f9-trusted-ca\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:14.640133 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.640009 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a4996022-62e6-4097-9973-46052375a1f9-registry-certificates\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:14.640133 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.640044 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-bound-sa-token\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:14.640133 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.640114 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-registry-tls\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:14.640313 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.640136 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a4996022-62e6-4097-9973-46052375a1f9-ca-trust-extracted\") pod 
\"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:14.640313 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.640164 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a4996022-62e6-4097-9973-46052375a1f9-image-registry-private-configuration\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:14.640313 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.640191 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a4996022-62e6-4097-9973-46052375a1f9-installation-pull-secrets\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:14.741343 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.741305 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a4996022-62e6-4097-9973-46052375a1f9-registry-certificates\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:14.741343 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.741345 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-bound-sa-token\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:14.741538 
ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.741408 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-registry-tls\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:14.741538 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.741427 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a4996022-62e6-4097-9973-46052375a1f9-ca-trust-extracted\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:14.741538 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.741456 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a4996022-62e6-4097-9973-46052375a1f9-image-registry-private-configuration\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:14.741538 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.741482 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a4996022-62e6-4097-9973-46052375a1f9-installation-pull-secrets\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:14.741538 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.741515 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtc27\" (UniqueName: 
\"kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-kube-api-access-rtc27\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:14.741773 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.741542 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4996022-62e6-4097-9973-46052375a1f9-trusted-ca\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:14.741773 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:14.741611 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:56:14.741773 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:14.741637 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c4577ddb7-8svjg: secret "image-registry-tls" not found Apr 24 23:56:14.741773 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:14.741728 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-registry-tls podName:a4996022-62e6-4097-9973-46052375a1f9 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:15.241705282 +0000 UTC m=+135.522822180 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-registry-tls") pod "image-registry-c4577ddb7-8svjg" (UID: "a4996022-62e6-4097-9973-46052375a1f9") : secret "image-registry-tls" not found Apr 24 23:56:14.742012 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.741994 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a4996022-62e6-4097-9973-46052375a1f9-registry-certificates\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:14.742109 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.742089 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a4996022-62e6-4097-9973-46052375a1f9-ca-trust-extracted\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:14.742759 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.742739 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4996022-62e6-4097-9973-46052375a1f9-trusted-ca\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:14.744219 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.744198 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a4996022-62e6-4097-9973-46052375a1f9-installation-pull-secrets\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 
23:56:14.744290 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.744237 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a4996022-62e6-4097-9973-46052375a1f9-image-registry-private-configuration\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:14.752291 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.752274 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtc27\" (UniqueName: \"kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-kube-api-access-rtc27\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:14.752395 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:14.752375 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-bound-sa-token\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:15.244466 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:15.244434 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-registry-tls\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:15.244633 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:15.244550 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:56:15.244633 ip-10-0-129-109 kubenswrapper[2576]: 
E0424 23:56:15.244564 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c4577ddb7-8svjg: secret "image-registry-tls" not found Apr 24 23:56:15.244701 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:15.244641 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-registry-tls podName:a4996022-62e6-4097-9973-46052375a1f9 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:16.244622873 +0000 UTC m=+136.525739786 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-registry-tls") pod "image-registry-c4577ddb7-8svjg" (UID: "a4996022-62e6-4097-9973-46052375a1f9") : secret "image-registry-tls" not found Apr 24 23:56:16.251130 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:16.251079 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-registry-tls\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:16.251625 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:16.251243 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:56:16.251625 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:16.251268 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c4577ddb7-8svjg: secret "image-registry-tls" not found Apr 24 23:56:16.251625 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:16.251352 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-registry-tls 
podName:a4996022-62e6-4097-9973-46052375a1f9 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:18.25133104 +0000 UTC m=+138.532447952 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-registry-tls") pod "image-registry-c4577ddb7-8svjg" (UID: "a4996022-62e6-4097-9973-46052375a1f9") : secret "image-registry-tls" not found Apr 24 23:56:16.761133 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:16.761096 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-bj6s9"] Apr 24 23:56:16.765143 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:16.765121 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bj6s9" Apr 24 23:56:16.775783 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:16.775756 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 23:56:16.775943 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:16.775926 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 23:56:16.775998 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:16.775983 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 23:56:16.775998 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:16.775927 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-l8vsl\"" Apr 24 23:56:16.776706 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:16.776691 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 23:56:16.782745 ip-10-0-129-109 kubenswrapper[2576]: 
I0424 23:56:16.782720 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bj6s9"] Apr 24 23:56:16.856106 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:16.856078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzjzt\" (UniqueName: \"kubernetes.io/projected/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-kube-api-access-wzjzt\") pod \"insights-runtime-extractor-bj6s9\" (UID: \"21a2be45-b5f0-4fe5-aabf-c7b8783b56af\") " pod="openshift-insights/insights-runtime-extractor-bj6s9" Apr 24 23:56:16.856210 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:16.856139 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-crio-socket\") pod \"insights-runtime-extractor-bj6s9\" (UID: \"21a2be45-b5f0-4fe5-aabf-c7b8783b56af\") " pod="openshift-insights/insights-runtime-extractor-bj6s9" Apr 24 23:56:16.856250 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:16.856202 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-data-volume\") pod \"insights-runtime-extractor-bj6s9\" (UID: \"21a2be45-b5f0-4fe5-aabf-c7b8783b56af\") " pod="openshift-insights/insights-runtime-extractor-bj6s9" Apr 24 23:56:16.856250 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:16.856228 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bj6s9\" (UID: \"21a2be45-b5f0-4fe5-aabf-c7b8783b56af\") " pod="openshift-insights/insights-runtime-extractor-bj6s9" Apr 24 23:56:16.856316 ip-10-0-129-109 kubenswrapper[2576]: I0424 
23:56:16.856262 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bj6s9\" (UID: \"21a2be45-b5f0-4fe5-aabf-c7b8783b56af\") " pod="openshift-insights/insights-runtime-extractor-bj6s9" Apr 24 23:56:16.957020 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:16.956999 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wzjzt\" (UniqueName: \"kubernetes.io/projected/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-kube-api-access-wzjzt\") pod \"insights-runtime-extractor-bj6s9\" (UID: \"21a2be45-b5f0-4fe5-aabf-c7b8783b56af\") " pod="openshift-insights/insights-runtime-extractor-bj6s9" Apr 24 23:56:16.957111 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:16.957051 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-crio-socket\") pod \"insights-runtime-extractor-bj6s9\" (UID: \"21a2be45-b5f0-4fe5-aabf-c7b8783b56af\") " pod="openshift-insights/insights-runtime-extractor-bj6s9" Apr 24 23:56:16.957111 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:16.957085 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-data-volume\") pod \"insights-runtime-extractor-bj6s9\" (UID: \"21a2be45-b5f0-4fe5-aabf-c7b8783b56af\") " pod="openshift-insights/insights-runtime-extractor-bj6s9" Apr 24 23:56:16.957174 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:16.957108 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-kube-rbac-proxy-cm\") pod 
\"insights-runtime-extractor-bj6s9\" (UID: \"21a2be45-b5f0-4fe5-aabf-c7b8783b56af\") " pod="openshift-insights/insights-runtime-extractor-bj6s9" Apr 24 23:56:16.957174 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:16.957150 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bj6s9\" (UID: \"21a2be45-b5f0-4fe5-aabf-c7b8783b56af\") " pod="openshift-insights/insights-runtime-extractor-bj6s9" Apr 24 23:56:16.957244 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:16.957178 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-crio-socket\") pod \"insights-runtime-extractor-bj6s9\" (UID: \"21a2be45-b5f0-4fe5-aabf-c7b8783b56af\") " pod="openshift-insights/insights-runtime-extractor-bj6s9" Apr 24 23:56:16.957276 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:16.957243 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 23:56:16.957317 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:16.957307 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-insights-runtime-extractor-tls podName:21a2be45-b5f0-4fe5-aabf-c7b8783b56af nodeName:}" failed. No retries permitted until 2026-04-24 23:56:17.457285839 +0000 UTC m=+137.738402755 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-insights-runtime-extractor-tls") pod "insights-runtime-extractor-bj6s9" (UID: "21a2be45-b5f0-4fe5-aabf-c7b8783b56af") : secret "insights-runtime-extractor-tls" not found Apr 24 23:56:16.957509 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:16.957489 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-data-volume\") pod \"insights-runtime-extractor-bj6s9\" (UID: \"21a2be45-b5f0-4fe5-aabf-c7b8783b56af\") " pod="openshift-insights/insights-runtime-extractor-bj6s9" Apr 24 23:56:16.957697 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:16.957679 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bj6s9\" (UID: \"21a2be45-b5f0-4fe5-aabf-c7b8783b56af\") " pod="openshift-insights/insights-runtime-extractor-bj6s9" Apr 24 23:56:16.969524 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:16.969499 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzjzt\" (UniqueName: \"kubernetes.io/projected/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-kube-api-access-wzjzt\") pod \"insights-runtime-extractor-bj6s9\" (UID: \"21a2be45-b5f0-4fe5-aabf-c7b8783b56af\") " pod="openshift-insights/insights-runtime-extractor-bj6s9" Apr 24 23:56:17.461073 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:17.461033 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bj6s9\" (UID: \"21a2be45-b5f0-4fe5-aabf-c7b8783b56af\") " 
pod="openshift-insights/insights-runtime-extractor-bj6s9" Apr 24 23:56:17.461446 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:17.461199 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 23:56:17.461446 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:17.461261 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-insights-runtime-extractor-tls podName:21a2be45-b5f0-4fe5-aabf-c7b8783b56af nodeName:}" failed. No retries permitted until 2026-04-24 23:56:18.461245833 +0000 UTC m=+138.742362731 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-insights-runtime-extractor-tls") pod "insights-runtime-extractor-bj6s9" (UID: "21a2be45-b5f0-4fe5-aabf-c7b8783b56af") : secret "insights-runtime-extractor-tls" not found Apr 24 23:56:18.267078 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:18.267032 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-registry-tls\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:18.267241 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:18.267153 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:56:18.267241 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:18.267164 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c4577ddb7-8svjg: secret "image-registry-tls" not found Apr 24 23:56:18.267241 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:18.267213 2576 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-registry-tls podName:a4996022-62e6-4097-9973-46052375a1f9 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:22.267198597 +0000 UTC m=+142.548315495 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-registry-tls") pod "image-registry-c4577ddb7-8svjg" (UID: "a4996022-62e6-4097-9973-46052375a1f9") : secret "image-registry-tls" not found Apr 24 23:56:18.468633 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:18.468604 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bj6s9\" (UID: \"21a2be45-b5f0-4fe5-aabf-c7b8783b56af\") " pod="openshift-insights/insights-runtime-extractor-bj6s9" Apr 24 23:56:18.468986 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:18.468697 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 23:56:18.468986 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:18.468753 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-insights-runtime-extractor-tls podName:21a2be45-b5f0-4fe5-aabf-c7b8783b56af nodeName:}" failed. No retries permitted until 2026-04-24 23:56:20.468740216 +0000 UTC m=+140.749857115 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-insights-runtime-extractor-tls") pod "insights-runtime-extractor-bj6s9" (UID: "21a2be45-b5f0-4fe5-aabf-c7b8783b56af") : secret "insights-runtime-extractor-tls" not found Apr 24 23:56:20.483144 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:20.483096 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bj6s9\" (UID: \"21a2be45-b5f0-4fe5-aabf-c7b8783b56af\") " pod="openshift-insights/insights-runtime-extractor-bj6s9" Apr 24 23:56:20.483684 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:20.483225 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 23:56:20.483684 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:20.483292 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-insights-runtime-extractor-tls podName:21a2be45-b5f0-4fe5-aabf-c7b8783b56af nodeName:}" failed. No retries permitted until 2026-04-24 23:56:24.483269461 +0000 UTC m=+144.764386362 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-insights-runtime-extractor-tls") pod "insights-runtime-extractor-bj6s9" (UID: "21a2be45-b5f0-4fe5-aabf-c7b8783b56af") : secret "insights-runtime-extractor-tls" not found Apr 24 23:56:22.297786 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:22.297749 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-registry-tls\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:22.298174 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:22.297892 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:56:22.298174 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:22.297938 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c4577ddb7-8svjg: secret "image-registry-tls" not found Apr 24 23:56:22.298174 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:22.297996 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-registry-tls podName:a4996022-62e6-4097-9973-46052375a1f9 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:30.297981021 +0000 UTC m=+150.579097919 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-registry-tls") pod "image-registry-c4577ddb7-8svjg" (UID: "a4996022-62e6-4097-9973-46052375a1f9") : secret "image-registry-tls" not found Apr 24 23:56:24.369242 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:24.369189 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" podUID="a8ae5a9e-22ac-4057-baaa-8f48082fd2d3" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 23:56:24.514860 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:24.514827 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bj6s9\" (UID: \"21a2be45-b5f0-4fe5-aabf-c7b8783b56af\") " pod="openshift-insights/insights-runtime-extractor-bj6s9" Apr 24 23:56:24.515004 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:24.514995 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 23:56:24.515066 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:24.515055 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-insights-runtime-extractor-tls podName:21a2be45-b5f0-4fe5-aabf-c7b8783b56af nodeName:}" failed. No retries permitted until 2026-04-24 23:56:32.515037883 +0000 UTC m=+152.796154785 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-insights-runtime-extractor-tls") pod "insights-runtime-extractor-bj6s9" (UID: "21a2be45-b5f0-4fe5-aabf-c7b8783b56af") : secret "insights-runtime-extractor-tls" not found Apr 24 23:56:30.358698 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:30.358665 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-registry-tls\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:30.361948 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:30.361928 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-registry-tls\") pod \"image-registry-c4577ddb7-8svjg\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:30.434793 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:30.434761 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:30.563746 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:30.563722 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-c4577ddb7-8svjg"] Apr 24 23:56:30.566239 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:56:30.566205 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4996022_62e6_4097_9973_46052375a1f9.slice/crio-3957489bfdf02c63fafdfcd94169d6041fb596e6d8f0d03d473c5c799d61a782 WatchSource:0}: Error finding container 3957489bfdf02c63fafdfcd94169d6041fb596e6d8f0d03d473c5c799d61a782: Status 404 returned error can't find the container with id 3957489bfdf02c63fafdfcd94169d6041fb596e6d8f0d03d473c5c799d61a782 Apr 24 23:56:30.781007 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:30.780973 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" event={"ID":"a4996022-62e6-4097-9973-46052375a1f9","Type":"ContainerStarted","Data":"c045b73044ab880d9d58d9d7eaf6de691ef02dbbd95c81a9380856810c252136"} Apr 24 23:56:30.781007 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:30.781009 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" event={"ID":"a4996022-62e6-4097-9973-46052375a1f9","Type":"ContainerStarted","Data":"3957489bfdf02c63fafdfcd94169d6041fb596e6d8f0d03d473c5c799d61a782"} Apr 24 23:56:30.781332 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:30.781105 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:56:30.805078 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:30.805035 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" 
podStartSLOduration=16.805018733 podStartE2EDuration="16.805018733s" podCreationTimestamp="2026-04-24 23:56:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:56:30.803492283 +0000 UTC m=+151.084609216" watchObservedRunningTime="2026-04-24 23:56:30.805018733 +0000 UTC m=+151.086135654" Apr 24 23:56:32.575329 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:32.575290 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bj6s9\" (UID: \"21a2be45-b5f0-4fe5-aabf-c7b8783b56af\") " pod="openshift-insights/insights-runtime-extractor-bj6s9" Apr 24 23:56:32.577767 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:32.577745 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/21a2be45-b5f0-4fe5-aabf-c7b8783b56af-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bj6s9\" (UID: \"21a2be45-b5f0-4fe5-aabf-c7b8783b56af\") " pod="openshift-insights/insights-runtime-extractor-bj6s9" Apr 24 23:56:32.674008 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:32.673983 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bj6s9" Apr 24 23:56:32.791332 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:32.791313 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bj6s9"] Apr 24 23:56:32.793793 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:56:32.793766 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21a2be45_b5f0_4fe5_aabf_c7b8783b56af.slice/crio-a9cd6503fc3981e9cebb7087f24846a08e090186b00b930253f3721cb00e6b16 WatchSource:0}: Error finding container a9cd6503fc3981e9cebb7087f24846a08e090186b00b930253f3721cb00e6b16: Status 404 returned error can't find the container with id a9cd6503fc3981e9cebb7087f24846a08e090186b00b930253f3721cb00e6b16 Apr 24 23:56:33.790266 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:33.790233 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bj6s9" event={"ID":"21a2be45-b5f0-4fe5-aabf-c7b8783b56af","Type":"ContainerStarted","Data":"b09afe2b69c54086ccf8f26e50a67347458391a417dc2e73bd5baf26ecb414e8"} Apr 24 23:56:33.790266 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:33.790267 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bj6s9" event={"ID":"21a2be45-b5f0-4fe5-aabf-c7b8783b56af","Type":"ContainerStarted","Data":"73af023f979ef2ed6e93252d22cfd9d316f86861f3789de433354c948a851908"} Apr 24 23:56:33.790646 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:33.790281 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bj6s9" event={"ID":"21a2be45-b5f0-4fe5-aabf-c7b8783b56af","Type":"ContainerStarted","Data":"a9cd6503fc3981e9cebb7087f24846a08e090186b00b930253f3721cb00e6b16"} Apr 24 23:56:34.369139 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:34.369081 2576 prober.go:120] "Probe failed" 
probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" podUID="a8ae5a9e-22ac-4057-baaa-8f48082fd2d3" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 23:56:34.794285 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:34.794247 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bj6s9" event={"ID":"21a2be45-b5f0-4fe5-aabf-c7b8783b56af","Type":"ContainerStarted","Data":"fe450da61b09300da2553a4103c0688842a7ac5c590494087386b99c33158af9"}
Apr 24 23:56:34.812691 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:34.812648 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-bj6s9" podStartSLOduration=16.951032162 podStartE2EDuration="18.812634895s" podCreationTimestamp="2026-04-24 23:56:16 +0000 UTC" firstStartedPulling="2026-04-24 23:56:32.848183122 +0000 UTC m=+153.129300021" lastFinishedPulling="2026-04-24 23:56:34.709785854 +0000 UTC m=+154.990902754" observedRunningTime="2026-04-24 23:56:34.812114175 +0000 UTC m=+155.093231096" watchObservedRunningTime="2026-04-24 23:56:34.812634895 +0000 UTC m=+155.093751814"
Apr 24 23:56:36.163019 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:36.162965 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-fpgn9" podUID="972771be-01b9-4da1-b895-914fde15bc88"
Apr 24 23:56:36.178105 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:36.178076 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-6zfbj" podUID="73aaed41-fe6b-4446-8ab2-95e11e051d4b"
Apr 24 23:56:36.799348 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:36.799316 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fpgn9"
Apr 24 23:56:37.326413 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:37.326374 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-fdw8f" podUID="6206bc2d-d85c-4007-8a04-e9eb243f590c"
Apr 24 23:56:40.205242 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:40.205165 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-c4577ddb7-8svjg"]
Apr 24 23:56:41.135049 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:41.135002 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert\") pod \"ingress-canary-6zfbj\" (UID: \"73aaed41-fe6b-4446-8ab2-95e11e051d4b\") " pod="openshift-ingress-canary/ingress-canary-6zfbj"
Apr 24 23:56:41.135290 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:41.135067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls\") pod \"dns-default-fpgn9\" (UID: \"972771be-01b9-4da1-b895-914fde15bc88\") " pod="openshift-dns/dns-default-fpgn9"
Apr 24 23:56:41.137540 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:41.137517 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/73aaed41-fe6b-4446-8ab2-95e11e051d4b-cert\") pod \"ingress-canary-6zfbj\" (UID: \"73aaed41-fe6b-4446-8ab2-95e11e051d4b\") " pod="openshift-ingress-canary/ingress-canary-6zfbj"
Apr 24 23:56:41.137644 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:41.137551 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/972771be-01b9-4da1-b895-914fde15bc88-metrics-tls\") pod \"dns-default-fpgn9\" (UID: \"972771be-01b9-4da1-b895-914fde15bc88\") " pod="openshift-dns/dns-default-fpgn9"
Apr 24 23:56:41.302856 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:41.302829 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-drk7t\""
Apr 24 23:56:41.310806 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:41.310789 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fpgn9"
Apr 24 23:56:41.424019 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:41.423953 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fpgn9"]
Apr 24 23:56:41.427246 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:56:41.427223 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod972771be_01b9_4da1_b895_914fde15bc88.slice/crio-a5785a36707f9c96a584d8a8264ba8ff96e8ef22afccde068aea53fcc782a753 WatchSource:0}: Error finding container a5785a36707f9c96a584d8a8264ba8ff96e8ef22afccde068aea53fcc782a753: Status 404 returned error can't find the container with id a5785a36707f9c96a584d8a8264ba8ff96e8ef22afccde068aea53fcc782a753
Apr 24 23:56:41.813025 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:41.812985 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fpgn9" event={"ID":"972771be-01b9-4da1-b895-914fde15bc88","Type":"ContainerStarted","Data":"a5785a36707f9c96a584d8a8264ba8ff96e8ef22afccde068aea53fcc782a753"}
Apr 24 23:56:42.817336 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:42.817262 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fpgn9" event={"ID":"972771be-01b9-4da1-b895-914fde15bc88","Type":"ContainerStarted","Data":"4cc6d8798b40a5818b4216d2fde162b9c655dbcb089023a17201cb7716df8b14"}
Apr 24 23:56:42.817336 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:42.817301 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fpgn9" event={"ID":"972771be-01b9-4da1-b895-914fde15bc88","Type":"ContainerStarted","Data":"726f08ad25fdc68863420df289783cdecb51732385c6583a03587cecac7eae71"}
Apr 24 23:56:42.817676 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:42.817427 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-fpgn9"
Apr 24 23:56:42.834331 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:42.834285 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fpgn9" podStartSLOduration=128.707788375 podStartE2EDuration="2m9.834272592s" podCreationTimestamp="2026-04-24 23:54:33 +0000 UTC" firstStartedPulling="2026-04-24 23:56:41.429036515 +0000 UTC m=+161.710153413" lastFinishedPulling="2026-04-24 23:56:42.555520725 +0000 UTC m=+162.836637630" observedRunningTime="2026-04-24 23:56:42.833379693 +0000 UTC m=+163.114496614" watchObservedRunningTime="2026-04-24 23:56:42.834272592 +0000 UTC m=+163.115389512"
Apr 24 23:56:44.368428 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:44.368387 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" podUID="a8ae5a9e-22ac-4057-baaa-8f48082fd2d3" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 23:56:44.368857 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:44.368452 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml"
Apr 24 23:56:44.368937 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:44.368846 2576 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"e0fd049b8d5b0ec2bda5aff05bcf836787d3453652149fe656396f2d73ab8249"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 24 23:56:44.368937 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:44.368895 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" podUID="a8ae5a9e-22ac-4057-baaa-8f48082fd2d3" containerName="service-proxy" containerID="cri-o://e0fd049b8d5b0ec2bda5aff05bcf836787d3453652149fe656396f2d73ab8249" gracePeriod=30
Apr 24 23:56:44.825498 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:44.825466 2576 generic.go:358] "Generic (PLEG): container finished" podID="a8ae5a9e-22ac-4057-baaa-8f48082fd2d3" containerID="e0fd049b8d5b0ec2bda5aff05bcf836787d3453652149fe656396f2d73ab8249" exitCode=2
Apr 24 23:56:44.825498 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:44.825508 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" event={"ID":"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3","Type":"ContainerDied","Data":"e0fd049b8d5b0ec2bda5aff05bcf836787d3453652149fe656396f2d73ab8249"}
Apr 24 23:56:44.825746 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:44.825543 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f44557f74-jfnml" event={"ID":"a8ae5a9e-22ac-4057-baaa-8f48082fd2d3","Type":"ContainerStarted","Data":"beef3fbb00a6fb77fb2e1593e09fe5cfb2a50be975aa197942960cac4b52a605"}
Apr 24 23:56:47.765315 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.765281 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-cq4m6"]
Apr 24 23:56:47.768427 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.768401 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6"
Apr 24 23:56:47.769450 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.769426 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-569p9"]
Apr 24 23:56:47.770898 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.770880 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 23:56:47.771147 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.771131 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-vjmll\""
Apr 24 23:56:47.771392 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.771378 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 23:56:47.772126 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.772105 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 24 23:56:47.772208 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.772136 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 24 23:56:47.772273 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.772209 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 23:56:47.772273 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.772222 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 24 23:56:47.772370 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.772315 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.774443 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.774414 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 23:56:47.774568 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.774433 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 23:56:47.774669 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.774652 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7cj9k\""
Apr 24 23:56:47.774750 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.774662 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 23:56:47.779175 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.779158 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-cq4m6"]
Apr 24 23:56:47.881684 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.881654 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/12c65432-33ba-4594-b447-d3f8ad398777-node-exporter-tls\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.881684 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.881682 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12c65432-33ba-4594-b447-d3f8ad398777-metrics-client-ca\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.881886 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.881707 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3341466c-0b0e-499e-8a69-b4a033f0e495-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-cq4m6\" (UID: \"3341466c-0b0e-499e-8a69-b4a033f0e495\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6"
Apr 24 23:56:47.881886 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.881730 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12c65432-33ba-4594-b447-d3f8ad398777-sys\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.881886 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.881801 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/12c65432-33ba-4594-b447-d3f8ad398777-node-exporter-wtmp\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.882030 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.881893 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d64n4\" (UniqueName: \"kubernetes.io/projected/12c65432-33ba-4594-b447-d3f8ad398777-kube-api-access-d64n4\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.882030 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.881975 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/12c65432-33ba-4594-b447-d3f8ad398777-root\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.882030 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.881994 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm2np\" (UniqueName: \"kubernetes.io/projected/3341466c-0b0e-499e-8a69-b4a033f0e495-kube-api-access-xm2np\") pod \"kube-state-metrics-69db897b98-cq4m6\" (UID: \"3341466c-0b0e-499e-8a69-b4a033f0e495\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6"
Apr 24 23:56:47.882030 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.882025 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/12c65432-33ba-4594-b447-d3f8ad398777-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.882177 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.882065 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/3341466c-0b0e-499e-8a69-b4a033f0e495-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-cq4m6\" (UID: \"3341466c-0b0e-499e-8a69-b4a033f0e495\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6"
Apr 24 23:56:47.882177 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.882089 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/3341466c-0b0e-499e-8a69-b4a033f0e495-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-cq4m6\" (UID: \"3341466c-0b0e-499e-8a69-b4a033f0e495\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6"
Apr 24 23:56:47.882177 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.882110 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/12c65432-33ba-4594-b447-d3f8ad398777-node-exporter-accelerators-collector-config\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.882177 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.882134 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3341466c-0b0e-499e-8a69-b4a033f0e495-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-cq4m6\" (UID: \"3341466c-0b0e-499e-8a69-b4a033f0e495\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6"
Apr 24 23:56:47.882177 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.882157 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/12c65432-33ba-4594-b447-d3f8ad398777-node-exporter-textfile\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.882177 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.882170 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3341466c-0b0e-499e-8a69-b4a033f0e495-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-cq4m6\" (UID: \"3341466c-0b0e-499e-8a69-b4a033f0e495\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6"
Apr 24 23:56:47.982681 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.982650 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/12c65432-33ba-4594-b447-d3f8ad398777-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.982845 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.982692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/3341466c-0b0e-499e-8a69-b4a033f0e495-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-cq4m6\" (UID: \"3341466c-0b0e-499e-8a69-b4a033f0e495\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6"
Apr 24 23:56:47.982845 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.982720 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/3341466c-0b0e-499e-8a69-b4a033f0e495-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-cq4m6\" (UID: \"3341466c-0b0e-499e-8a69-b4a033f0e495\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6"
Apr 24 23:56:47.982845 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.982748 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/12c65432-33ba-4594-b447-d3f8ad398777-node-exporter-accelerators-collector-config\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.982845 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.982774 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3341466c-0b0e-499e-8a69-b4a033f0e495-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-cq4m6\" (UID: \"3341466c-0b0e-499e-8a69-b4a033f0e495\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6"
Apr 24 23:56:47.982845 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.982799 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/12c65432-33ba-4594-b447-d3f8ad398777-node-exporter-textfile\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.982845 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.982824 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3341466c-0b0e-499e-8a69-b4a033f0e495-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-cq4m6\" (UID: \"3341466c-0b0e-499e-8a69-b4a033f0e495\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6"
Apr 24 23:56:47.983192 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.982862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/12c65432-33ba-4594-b447-d3f8ad398777-node-exporter-tls\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.983192 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.982883 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12c65432-33ba-4594-b447-d3f8ad398777-metrics-client-ca\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.983192 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.982935 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3341466c-0b0e-499e-8a69-b4a033f0e495-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-cq4m6\" (UID: \"3341466c-0b0e-499e-8a69-b4a033f0e495\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6"
Apr 24 23:56:47.983192 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.982959 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12c65432-33ba-4594-b447-d3f8ad398777-sys\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.983192 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.982994 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/12c65432-33ba-4594-b447-d3f8ad398777-node-exporter-wtmp\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.983192 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:47.983018 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 24 23:56:47.983192 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:56:47.983119 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12c65432-33ba-4594-b447-d3f8ad398777-node-exporter-tls podName:12c65432-33ba-4594-b447-d3f8ad398777 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:48.483095502 +0000 UTC m=+168.764212413 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/12c65432-33ba-4594-b447-d3f8ad398777-node-exporter-tls") pod "node-exporter-569p9" (UID: "12c65432-33ba-4594-b447-d3f8ad398777") : secret "node-exporter-tls" not found
Apr 24 23:56:47.983529 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.983217 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/3341466c-0b0e-499e-8a69-b4a033f0e495-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-cq4m6\" (UID: \"3341466c-0b0e-499e-8a69-b4a033f0e495\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6"
Apr 24 23:56:47.983529 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.983021 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d64n4\" (UniqueName: \"kubernetes.io/projected/12c65432-33ba-4594-b447-d3f8ad398777-kube-api-access-d64n4\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.983529 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.983288 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/12c65432-33ba-4594-b447-d3f8ad398777-root\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.983529 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.983302 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/12c65432-33ba-4594-b447-d3f8ad398777-node-exporter-textfile\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.983529 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.983319 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xm2np\" (UniqueName: \"kubernetes.io/projected/3341466c-0b0e-499e-8a69-b4a033f0e495-kube-api-access-xm2np\") pod \"kube-state-metrics-69db897b98-cq4m6\" (UID: \"3341466c-0b0e-499e-8a69-b4a033f0e495\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6"
Apr 24 23:56:47.983529 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.983359 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/12c65432-33ba-4594-b447-d3f8ad398777-root\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.983529 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.983409 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12c65432-33ba-4594-b447-d3f8ad398777-sys\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.983805 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.983532 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/12c65432-33ba-4594-b447-d3f8ad398777-node-exporter-accelerators-collector-config\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.983805 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.983533 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/3341466c-0b0e-499e-8a69-b4a033f0e495-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-cq4m6\" (UID: \"3341466c-0b0e-499e-8a69-b4a033f0e495\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6"
Apr 24 23:56:47.983805 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.983663 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/12c65432-33ba-4594-b447-d3f8ad398777-node-exporter-wtmp\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.983906 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.983836 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3341466c-0b0e-499e-8a69-b4a033f0e495-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-cq4m6\" (UID: \"3341466c-0b0e-499e-8a69-b4a033f0e495\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6"
Apr 24 23:56:47.984030 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.984005 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12c65432-33ba-4594-b447-d3f8ad398777-metrics-client-ca\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.985276 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.985249 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/12c65432-33ba-4594-b447-d3f8ad398777-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.986040 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.986019 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3341466c-0b0e-499e-8a69-b4a033f0e495-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-cq4m6\" (UID: \"3341466c-0b0e-499e-8a69-b4a033f0e495\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6"
Apr 24 23:56:47.986202 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.986186 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3341466c-0b0e-499e-8a69-b4a033f0e495-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-cq4m6\" (UID: \"3341466c-0b0e-499e-8a69-b4a033f0e495\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6"
Apr 24 23:56:47.992154 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.992135 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d64n4\" (UniqueName: \"kubernetes.io/projected/12c65432-33ba-4594-b447-d3f8ad398777-kube-api-access-d64n4\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:47.992620 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:47.992598 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm2np\" (UniqueName: \"kubernetes.io/projected/3341466c-0b0e-499e-8a69-b4a033f0e495-kube-api-access-xm2np\") pod \"kube-state-metrics-69db897b98-cq4m6\" (UID: \"3341466c-0b0e-499e-8a69-b4a033f0e495\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6"
Apr 24 23:56:48.078943 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:48.078863 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6"
Apr 24 23:56:48.193761 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:48.193727 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-cq4m6"]
Apr 24 23:56:48.197120 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:56:48.197098 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3341466c_0b0e_499e_8a69_b4a033f0e495.slice/crio-3755035c9bd1e8b35acfe20e76cb681fa068c1905e964952c12faf582d3ea132 WatchSource:0}: Error finding container 3755035c9bd1e8b35acfe20e76cb681fa068c1905e964952c12faf582d3ea132: Status 404 returned error can't find the container with id 3755035c9bd1e8b35acfe20e76cb681fa068c1905e964952c12faf582d3ea132
Apr 24 23:56:48.487479 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:48.487402 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/12c65432-33ba-4594-b447-d3f8ad398777-node-exporter-tls\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:48.489711 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:48.489695 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/12c65432-33ba-4594-b447-d3f8ad398777-node-exporter-tls\") pod \"node-exporter-569p9\" (UID: \"12c65432-33ba-4594-b447-d3f8ad398777\") " pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:48.684422 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:48.684390 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-569p9"
Apr 24 23:56:48.693465 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:56:48.693434 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12c65432_33ba_4594_b447_d3f8ad398777.slice/crio-d0476d77f2b95c9d6166dd6bc4dd7c177258f01106bfb33d072a294d97388b42 WatchSource:0}: Error finding container d0476d77f2b95c9d6166dd6bc4dd7c177258f01106bfb33d072a294d97388b42: Status 404 returned error can't find the container with id d0476d77f2b95c9d6166dd6bc4dd7c177258f01106bfb33d072a294d97388b42
Apr 24 23:56:48.838163 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:48.838119 2576 generic.go:358] "Generic (PLEG): container finished" podID="b6aa6598-8e40-4911-b6d1-de3287532b48" containerID="7cc8661ad3dc78b524a155e8e5dc77d4c592411f52d6bb27ba57d48a976226a1" exitCode=255
Apr 24 23:56:48.838523 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:48.838243 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8597487dcc-9rfjn" event={"ID":"b6aa6598-8e40-4911-b6d1-de3287532b48","Type":"ContainerDied","Data":"7cc8661ad3dc78b524a155e8e5dc77d4c592411f52d6bb27ba57d48a976226a1"}
Apr 24 23:56:48.838743 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:48.838720 2576 scope.go:117] "RemoveContainer" containerID="7cc8661ad3dc78b524a155e8e5dc77d4c592411f52d6bb27ba57d48a976226a1"
Apr 24 23:56:48.842941 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:48.840804 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-569p9" event={"ID":"12c65432-33ba-4594-b447-d3f8ad398777","Type":"ContainerStarted","Data":"d0476d77f2b95c9d6166dd6bc4dd7c177258f01106bfb33d072a294d97388b42"}
Apr 24 23:56:48.844425 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:48.844390 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6" event={"ID":"3341466c-0b0e-499e-8a69-b4a033f0e495","Type":"ContainerStarted","Data":"3755035c9bd1e8b35acfe20e76cb681fa068c1905e964952c12faf582d3ea132"}
Apr 24 23:56:49.311518 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:49.311400 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6zfbj"
Apr 24 23:56:49.311518 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:49.311405 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdw8f"
Apr 24 23:56:49.314126 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:49.314110 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gqd2q\""
Apr 24 23:56:49.322284 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:49.322077 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6zfbj"
Apr 24 23:56:49.496344 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:49.496300 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6zfbj"]
Apr 24 23:56:49.499682 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:56:49.499619 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73aaed41_fe6b_4446_8ab2_95e11e051d4b.slice/crio-97da8d1c9cae650f60b7f5e9c10ec2d9572762a8ac554c7e73f8f1ad4a832694 WatchSource:0}: Error finding container 97da8d1c9cae650f60b7f5e9c10ec2d9572762a8ac554c7e73f8f1ad4a832694: Status 404 returned error can't find the container with id 97da8d1c9cae650f60b7f5e9c10ec2d9572762a8ac554c7e73f8f1ad4a832694
Apr 24 23:56:49.847831 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:49.847800 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6zfbj"
event={"ID":"73aaed41-fe6b-4446-8ab2-95e11e051d4b","Type":"ContainerStarted","Data":"97da8d1c9cae650f60b7f5e9c10ec2d9572762a8ac554c7e73f8f1ad4a832694"} Apr 24 23:56:49.849436 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:49.849410 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8597487dcc-9rfjn" event={"ID":"b6aa6598-8e40-4911-b6d1-de3287532b48","Type":"ContainerStarted","Data":"5d86d2ca4309c975adbcc6edf711145afd3041725649280a4f3957562ef90011"} Apr 24 23:56:49.851034 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:49.850995 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-569p9" event={"ID":"12c65432-33ba-4594-b447-d3f8ad398777","Type":"ContainerStarted","Data":"4b32da83987c2ee38b6bd0364700aae8b48353a3b11d8dd74a448053cc0611cd"} Apr 24 23:56:49.853116 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:49.853093 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6" event={"ID":"3341466c-0b0e-499e-8a69-b4a033f0e495","Type":"ContainerStarted","Data":"7a3e4fbadf863b0c6e9a6b66bff3414bc991fe6c92499daf04e4d74ed2978ff7"} Apr 24 23:56:49.853116 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:49.853121 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6" event={"ID":"3341466c-0b0e-499e-8a69-b4a033f0e495","Type":"ContainerStarted","Data":"f095dbc688e864ebf2487a41cdc6c628f34a1baafa7a05d0d78144d6c5d86b03"} Apr 24 23:56:49.853246 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:49.853137 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6" event={"ID":"3341466c-0b0e-499e-8a69-b4a033f0e495","Type":"ContainerStarted","Data":"c84a5038b13fee25907f7d62f715db1899fd898231b63ab6b78eaca9ec8dda74"} Apr 24 23:56:49.882621 ip-10-0-129-109 kubenswrapper[2576]: I0424 
23:56:49.882572 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-cq4m6" podStartSLOduration=1.844525996 podStartE2EDuration="2.88255666s" podCreationTimestamp="2026-04-24 23:56:47 +0000 UTC" firstStartedPulling="2026-04-24 23:56:48.198806008 +0000 UTC m=+168.479922905" lastFinishedPulling="2026-04-24 23:56:49.236836672 +0000 UTC m=+169.517953569" observedRunningTime="2026-04-24 23:56:49.881708109 +0000 UTC m=+170.162825031" watchObservedRunningTime="2026-04-24 23:56:49.88255666 +0000 UTC m=+170.163673577" Apr 24 23:56:50.212044 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:50.212008 2576 patch_prober.go:28] interesting pod/image-registry-c4577ddb7-8svjg container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 23:56:50.212260 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:50.212063 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" podUID="a4996022-62e6-4097-9973-46052375a1f9" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 23:56:50.857392 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:50.857350 2576 generic.go:358] "Generic (PLEG): container finished" podID="12c65432-33ba-4594-b447-d3f8ad398777" containerID="4b32da83987c2ee38b6bd0364700aae8b48353a3b11d8dd74a448053cc0611cd" exitCode=0 Apr 24 23:56:50.857848 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:50.857453 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-569p9" event={"ID":"12c65432-33ba-4594-b447-d3f8ad398777","Type":"ContainerDied","Data":"4b32da83987c2ee38b6bd0364700aae8b48353a3b11d8dd74a448053cc0611cd"} Apr 24 23:56:51.861537 
ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:51.861493 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6zfbj" event={"ID":"73aaed41-fe6b-4446-8ab2-95e11e051d4b","Type":"ContainerStarted","Data":"25fd99cae1aabfb4f9c4d9b5adb440ea1b24b83e435c657f28673f87eb698639"} Apr 24 23:56:51.863280 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:51.863255 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-569p9" event={"ID":"12c65432-33ba-4594-b447-d3f8ad398777","Type":"ContainerStarted","Data":"1dfc1a113cf8a81871f211930cebf63d88c2fa3df073c39031c17377ad388655"} Apr 24 23:56:51.863383 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:51.863285 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-569p9" event={"ID":"12c65432-33ba-4594-b447-d3f8ad398777","Type":"ContainerStarted","Data":"ffeee888a2120d841acb8332d1f37aefd8ac8c29145dfed5da452ded0608ae26"} Apr 24 23:56:51.878333 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:51.878288 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6zfbj" podStartSLOduration=137.409637672 podStartE2EDuration="2m18.878277386s" podCreationTimestamp="2026-04-24 23:54:33 +0000 UTC" firstStartedPulling="2026-04-24 23:56:49.50133351 +0000 UTC m=+169.782450422" lastFinishedPulling="2026-04-24 23:56:50.969973235 +0000 UTC m=+171.251090136" observedRunningTime="2026-04-24 23:56:51.877461529 +0000 UTC m=+172.158578446" watchObservedRunningTime="2026-04-24 23:56:51.878277386 +0000 UTC m=+172.159394305" Apr 24 23:56:51.894704 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:51.894664 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-569p9" podStartSLOduration=3.8764021939999997 podStartE2EDuration="4.894652504s" podCreationTimestamp="2026-04-24 23:56:47 +0000 UTC" 
firstStartedPulling="2026-04-24 23:56:48.695233356 +0000 UTC m=+168.976350253" lastFinishedPulling="2026-04-24 23:56:49.713483663 +0000 UTC m=+169.994600563" observedRunningTime="2026-04-24 23:56:51.894444184 +0000 UTC m=+172.175561104" watchObservedRunningTime="2026-04-24 23:56:51.894652504 +0000 UTC m=+172.175769401" Apr 24 23:56:52.823557 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:56:52.823529 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fpgn9" Apr 24 23:57:00.209314 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:00.209283 2576 patch_prober.go:28] interesting pod/image-registry-c4577ddb7-8svjg container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 23:57:00.209653 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:00.209329 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" podUID="a4996022-62e6-4097-9973-46052375a1f9" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 23:57:05.224092 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.224046 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" podUID="a4996022-62e6-4097-9973-46052375a1f9" containerName="registry" containerID="cri-o://c045b73044ab880d9d58d9d7eaf6de691ef02dbbd95c81a9380856810c252136" gracePeriod=30 Apr 24 23:57:05.458342 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.458320 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:57:05.519680 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.519653 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a4996022-62e6-4097-9973-46052375a1f9-registry-certificates\") pod \"a4996022-62e6-4097-9973-46052375a1f9\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " Apr 24 23:57:05.519825 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.519694 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-registry-tls\") pod \"a4996022-62e6-4097-9973-46052375a1f9\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " Apr 24 23:57:05.519825 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.519736 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtc27\" (UniqueName: \"kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-kube-api-access-rtc27\") pod \"a4996022-62e6-4097-9973-46052375a1f9\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " Apr 24 23:57:05.519825 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.519772 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-bound-sa-token\") pod \"a4996022-62e6-4097-9973-46052375a1f9\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " Apr 24 23:57:05.519825 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.519804 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a4996022-62e6-4097-9973-46052375a1f9-ca-trust-extracted\") pod \"a4996022-62e6-4097-9973-46052375a1f9\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " Apr 24 
23:57:05.520006 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.519843 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4996022-62e6-4097-9973-46052375a1f9-trusted-ca\") pod \"a4996022-62e6-4097-9973-46052375a1f9\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " Apr 24 23:57:05.520006 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.519878 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a4996022-62e6-4097-9973-46052375a1f9-installation-pull-secrets\") pod \"a4996022-62e6-4097-9973-46052375a1f9\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " Apr 24 23:57:05.520006 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.519909 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a4996022-62e6-4097-9973-46052375a1f9-image-registry-private-configuration\") pod \"a4996022-62e6-4097-9973-46052375a1f9\" (UID: \"a4996022-62e6-4097-9973-46052375a1f9\") " Apr 24 23:57:05.520294 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.520264 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4996022-62e6-4097-9973-46052375a1f9-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a4996022-62e6-4097-9973-46052375a1f9" (UID: "a4996022-62e6-4097-9973-46052375a1f9"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:57:05.520449 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.520422 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4996022-62e6-4097-9973-46052375a1f9-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a4996022-62e6-4097-9973-46052375a1f9" (UID: "a4996022-62e6-4097-9973-46052375a1f9"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:57:05.522495 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.522464 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4996022-62e6-4097-9973-46052375a1f9-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "a4996022-62e6-4097-9973-46052375a1f9" (UID: "a4996022-62e6-4097-9973-46052375a1f9"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:57:05.522637 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.522609 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-kube-api-access-rtc27" (OuterVolumeSpecName: "kube-api-access-rtc27") pod "a4996022-62e6-4097-9973-46052375a1f9" (UID: "a4996022-62e6-4097-9973-46052375a1f9"). InnerVolumeSpecName "kube-api-access-rtc27". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:57:05.522754 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.522679 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a4996022-62e6-4097-9973-46052375a1f9" (UID: "a4996022-62e6-4097-9973-46052375a1f9"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:57:05.522812 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.522755 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4996022-62e6-4097-9973-46052375a1f9-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a4996022-62e6-4097-9973-46052375a1f9" (UID: "a4996022-62e6-4097-9973-46052375a1f9"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:57:05.522867 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.522830 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a4996022-62e6-4097-9973-46052375a1f9" (UID: "a4996022-62e6-4097-9973-46052375a1f9"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:57:05.528742 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.528717 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4996022-62e6-4097-9973-46052375a1f9-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a4996022-62e6-4097-9973-46052375a1f9" (UID: "a4996022-62e6-4097-9973-46052375a1f9"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:57:05.620876 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.620685 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a4996022-62e6-4097-9973-46052375a1f9-registry-certificates\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 24 23:57:05.620876 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.620724 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-registry-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 24 23:57:05.620876 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.620738 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rtc27\" (UniqueName: \"kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-kube-api-access-rtc27\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 24 23:57:05.620876 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.620770 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4996022-62e6-4097-9973-46052375a1f9-bound-sa-token\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 24 23:57:05.620876 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.620783 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a4996022-62e6-4097-9973-46052375a1f9-ca-trust-extracted\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 24 23:57:05.620876 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.620797 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4996022-62e6-4097-9973-46052375a1f9-trusted-ca\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 24 23:57:05.620876 ip-10-0-129-109 
kubenswrapper[2576]: I0424 23:57:05.620811 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a4996022-62e6-4097-9973-46052375a1f9-installation-pull-secrets\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 24 23:57:05.620876 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.620839 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a4996022-62e6-4097-9973-46052375a1f9-image-registry-private-configuration\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 24 23:57:05.904442 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.904352 2576 generic.go:358] "Generic (PLEG): container finished" podID="a4996022-62e6-4097-9973-46052375a1f9" containerID="c045b73044ab880d9d58d9d7eaf6de691ef02dbbd95c81a9380856810c252136" exitCode=0 Apr 24 23:57:05.904442 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.904410 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" Apr 24 23:57:05.904627 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.904411 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" event={"ID":"a4996022-62e6-4097-9973-46052375a1f9","Type":"ContainerDied","Data":"c045b73044ab880d9d58d9d7eaf6de691ef02dbbd95c81a9380856810c252136"} Apr 24 23:57:05.904627 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.904526 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c4577ddb7-8svjg" event={"ID":"a4996022-62e6-4097-9973-46052375a1f9","Type":"ContainerDied","Data":"3957489bfdf02c63fafdfcd94169d6041fb596e6d8f0d03d473c5c799d61a782"} Apr 24 23:57:05.904627 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.904549 2576 scope.go:117] "RemoveContainer" containerID="c045b73044ab880d9d58d9d7eaf6de691ef02dbbd95c81a9380856810c252136" Apr 24 23:57:05.912852 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.912835 2576 scope.go:117] "RemoveContainer" containerID="c045b73044ab880d9d58d9d7eaf6de691ef02dbbd95c81a9380856810c252136" Apr 24 23:57:05.913133 ip-10-0-129-109 kubenswrapper[2576]: E0424 23:57:05.913112 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c045b73044ab880d9d58d9d7eaf6de691ef02dbbd95c81a9380856810c252136\": container with ID starting with c045b73044ab880d9d58d9d7eaf6de691ef02dbbd95c81a9380856810c252136 not found: ID does not exist" containerID="c045b73044ab880d9d58d9d7eaf6de691ef02dbbd95c81a9380856810c252136" Apr 24 23:57:05.913202 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.913140 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c045b73044ab880d9d58d9d7eaf6de691ef02dbbd95c81a9380856810c252136"} err="failed to get container status 
\"c045b73044ab880d9d58d9d7eaf6de691ef02dbbd95c81a9380856810c252136\": rpc error: code = NotFound desc = could not find container \"c045b73044ab880d9d58d9d7eaf6de691ef02dbbd95c81a9380856810c252136\": container with ID starting with c045b73044ab880d9d58d9d7eaf6de691ef02dbbd95c81a9380856810c252136 not found: ID does not exist" Apr 24 23:57:05.924108 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.924083 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-c4577ddb7-8svjg"] Apr 24 23:57:05.927713 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:05.927693 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-c4577ddb7-8svjg"] Apr 24 23:57:06.315258 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:06.315231 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4996022-62e6-4097-9973-46052375a1f9" path="/var/lib/kubelet/pods/a4996022-62e6-4097-9973-46052375a1f9/volumes" Apr 24 23:57:34.144929 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:34.144883 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fpgn9_972771be-01b9-4da1-b895-914fde15bc88/dns/0.log" Apr 24 23:57:34.150570 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:34.150553 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fpgn9_972771be-01b9-4da1-b895-914fde15bc88/kube-rbac-proxy/0.log" Apr 24 23:57:34.313882 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:57:34.313855 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-58s9z_68924e4d-1b30-4887-a8bc-c624385685df/dns-node-resolver/0.log" Apr 24 23:58:12.083138 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:58:12.083062 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs\") pod 
\"network-metrics-daemon-fdw8f\" (UID: \"6206bc2d-d85c-4007-8a04-e9eb243f590c\") " pod="openshift-multus/network-metrics-daemon-fdw8f" Apr 24 23:58:12.085410 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:58:12.085392 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6206bc2d-d85c-4007-8a04-e9eb243f590c-metrics-certs\") pod \"network-metrics-daemon-fdw8f\" (UID: \"6206bc2d-d85c-4007-8a04-e9eb243f590c\") " pod="openshift-multus/network-metrics-daemon-fdw8f" Apr 24 23:58:12.114770 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:58:12.114748 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qtxb2\"" Apr 24 23:58:12.122557 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:58:12.122541 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fdw8f" Apr 24 23:58:12.232350 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:58:12.232321 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fdw8f"] Apr 24 23:58:12.235703 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:58:12.235680 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6206bc2d_d85c_4007_8a04_e9eb243f590c.slice/crio-249ee7ab513a43fcc2f8320574f44de214cdc152d666b38617746463d6ac4207 WatchSource:0}: Error finding container 249ee7ab513a43fcc2f8320574f44de214cdc152d666b38617746463d6ac4207: Status 404 returned error can't find the container with id 249ee7ab513a43fcc2f8320574f44de214cdc152d666b38617746463d6ac4207 Apr 24 23:58:13.067116 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:58:13.067075 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fdw8f" 
event={"ID":"6206bc2d-d85c-4007-8a04-e9eb243f590c","Type":"ContainerStarted","Data":"249ee7ab513a43fcc2f8320574f44de214cdc152d666b38617746463d6ac4207"} Apr 24 23:58:14.071305 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:58:14.071270 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fdw8f" event={"ID":"6206bc2d-d85c-4007-8a04-e9eb243f590c","Type":"ContainerStarted","Data":"d0f71a56baebed016ed52237a7cb7a5a286e4f8cea97678071df7a9379350f29"} Apr 24 23:58:14.071305 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:58:14.071304 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fdw8f" event={"ID":"6206bc2d-d85c-4007-8a04-e9eb243f590c","Type":"ContainerStarted","Data":"cc81869d29ed628f88eb04c3fa3eba1ab2548eacf0418d923b71945a65cc0d3a"} Apr 24 23:58:14.087457 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:58:14.087409 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-fdw8f" podStartSLOduration=253.14415446 podStartE2EDuration="4m14.087394348s" podCreationTimestamp="2026-04-24 23:54:00 +0000 UTC" firstStartedPulling="2026-04-24 23:58:12.237473152 +0000 UTC m=+252.518590051" lastFinishedPulling="2026-04-24 23:58:13.180713038 +0000 UTC m=+253.461829939" observedRunningTime="2026-04-24 23:58:14.086221283 +0000 UTC m=+254.367338198" watchObservedRunningTime="2026-04-24 23:58:14.087394348 +0000 UTC m=+254.368511265" Apr 24 23:59:00.189130 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:00.189100 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 24 23:59:00.189636 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:00.189264 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 24 
23:59:00.195595 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:00.195572 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 23:59:50.844517 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:50.844484 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-cgk2q"] Apr 24 23:59:50.845007 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:50.844697 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4996022-62e6-4097-9973-46052375a1f9" containerName="registry" Apr 24 23:59:50.845007 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:50.844708 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4996022-62e6-4097-9973-46052375a1f9" containerName="registry" Apr 24 23:59:50.845007 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:50.844754 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4996022-62e6-4097-9973-46052375a1f9" containerName="registry" Apr 24 23:59:50.847353 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:50.847332 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-cgk2q" Apr 24 23:59:50.849867 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:50.849846 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 23:59:50.856289 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:50.856249 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cgk2q"] Apr 24 23:59:50.921257 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:50.921232 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/301e8349-12fd-4785-95ed-4b2e9b42b9a6-kubelet-config\") pod \"global-pull-secret-syncer-cgk2q\" (UID: \"301e8349-12fd-4785-95ed-4b2e9b42b9a6\") " pod="kube-system/global-pull-secret-syncer-cgk2q" Apr 24 23:59:50.921388 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:50.921261 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/301e8349-12fd-4785-95ed-4b2e9b42b9a6-dbus\") pod \"global-pull-secret-syncer-cgk2q\" (UID: \"301e8349-12fd-4785-95ed-4b2e9b42b9a6\") " pod="kube-system/global-pull-secret-syncer-cgk2q" Apr 24 23:59:50.921388 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:50.921291 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/301e8349-12fd-4785-95ed-4b2e9b42b9a6-original-pull-secret\") pod \"global-pull-secret-syncer-cgk2q\" (UID: \"301e8349-12fd-4785-95ed-4b2e9b42b9a6\") " pod="kube-system/global-pull-secret-syncer-cgk2q" Apr 24 23:59:51.022231 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:51.022204 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/301e8349-12fd-4785-95ed-4b2e9b42b9a6-original-pull-secret\") pod \"global-pull-secret-syncer-cgk2q\" (UID: \"301e8349-12fd-4785-95ed-4b2e9b42b9a6\") " pod="kube-system/global-pull-secret-syncer-cgk2q" Apr 24 23:59:51.022384 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:51.022267 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/301e8349-12fd-4785-95ed-4b2e9b42b9a6-kubelet-config\") pod \"global-pull-secret-syncer-cgk2q\" (UID: \"301e8349-12fd-4785-95ed-4b2e9b42b9a6\") " pod="kube-system/global-pull-secret-syncer-cgk2q" Apr 24 23:59:51.022384 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:51.022284 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/301e8349-12fd-4785-95ed-4b2e9b42b9a6-dbus\") pod \"global-pull-secret-syncer-cgk2q\" (UID: \"301e8349-12fd-4785-95ed-4b2e9b42b9a6\") " pod="kube-system/global-pull-secret-syncer-cgk2q" Apr 24 23:59:51.022454 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:51.022377 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/301e8349-12fd-4785-95ed-4b2e9b42b9a6-kubelet-config\") pod \"global-pull-secret-syncer-cgk2q\" (UID: \"301e8349-12fd-4785-95ed-4b2e9b42b9a6\") " pod="kube-system/global-pull-secret-syncer-cgk2q" Apr 24 23:59:51.022454 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:51.022393 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/301e8349-12fd-4785-95ed-4b2e9b42b9a6-dbus\") pod \"global-pull-secret-syncer-cgk2q\" (UID: \"301e8349-12fd-4785-95ed-4b2e9b42b9a6\") " pod="kube-system/global-pull-secret-syncer-cgk2q" Apr 24 23:59:51.024585 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:51.024567 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/301e8349-12fd-4785-95ed-4b2e9b42b9a6-original-pull-secret\") pod \"global-pull-secret-syncer-cgk2q\" (UID: \"301e8349-12fd-4785-95ed-4b2e9b42b9a6\") " pod="kube-system/global-pull-secret-syncer-cgk2q" Apr 24 23:59:51.157093 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:51.157005 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cgk2q" Apr 24 23:59:51.271122 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:51.271092 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cgk2q"] Apr 24 23:59:51.274173 ip-10-0-129-109 kubenswrapper[2576]: W0424 23:59:51.274142 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod301e8349_12fd_4785_95ed_4b2e9b42b9a6.slice/crio-d23c597748d3dadf191465a9dae8009a373d2440d6b4edddca07c3931fa24bee WatchSource:0}: Error finding container d23c597748d3dadf191465a9dae8009a373d2440d6b4edddca07c3931fa24bee: Status 404 returned error can't find the container with id d23c597748d3dadf191465a9dae8009a373d2440d6b4edddca07c3931fa24bee Apr 24 23:59:51.276051 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:51.276031 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 23:59:51.323837 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:51.323809 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cgk2q" event={"ID":"301e8349-12fd-4785-95ed-4b2e9b42b9a6","Type":"ContainerStarted","Data":"d23c597748d3dadf191465a9dae8009a373d2440d6b4edddca07c3931fa24bee"} Apr 24 23:59:55.334417 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:55.334388 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cgk2q" 
event={"ID":"301e8349-12fd-4785-95ed-4b2e9b42b9a6","Type":"ContainerStarted","Data":"15b549728d35a007ac69f9803a7d98271659e713f64e59352046065bd4634f2e"} Apr 24 23:59:55.349609 ip-10-0-129-109 kubenswrapper[2576]: I0424 23:59:55.349564 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-cgk2q" podStartSLOduration=1.752007213 podStartE2EDuration="5.349550104s" podCreationTimestamp="2026-04-24 23:59:50 +0000 UTC" firstStartedPulling="2026-04-24 23:59:51.276162947 +0000 UTC m=+351.557279858" lastFinishedPulling="2026-04-24 23:59:54.873705847 +0000 UTC m=+355.154822749" observedRunningTime="2026-04-24 23:59:55.349354431 +0000 UTC m=+355.630471351" watchObservedRunningTime="2026-04-24 23:59:55.349550104 +0000 UTC m=+355.630667023" Apr 25 00:00:43.999071 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:00:43.999034 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-mmq4f"] Apr 25 00:00:44.004503 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:00:44.004480 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-mmq4f" Apr 25 00:00:44.006997 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:00:44.006975 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 25 00:00:44.007885 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:00:44.007867 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 25 00:00:44.008001 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:00:44.007884 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 25 00:00:44.008001 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:00:44.007903 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 25 00:00:44.008120 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:00:44.008010 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-ljd5x\"" Apr 25 00:00:44.012794 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:00:44.012760 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-mmq4f"] Apr 25 00:00:44.171446 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:00:44.171403 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cfd1f7cf-1ff6-45f2-8157-d40715de4224-certificates\") pod \"keda-admission-cf49989db-mmq4f\" (UID: \"cfd1f7cf-1ff6-45f2-8157-d40715de4224\") " pod="openshift-keda/keda-admission-cf49989db-mmq4f" Apr 25 00:00:44.171446 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:00:44.171448 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c82js\" (UniqueName: 
\"kubernetes.io/projected/cfd1f7cf-1ff6-45f2-8157-d40715de4224-kube-api-access-c82js\") pod \"keda-admission-cf49989db-mmq4f\" (UID: \"cfd1f7cf-1ff6-45f2-8157-d40715de4224\") " pod="openshift-keda/keda-admission-cf49989db-mmq4f" Apr 25 00:00:44.272475 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:00:44.272433 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cfd1f7cf-1ff6-45f2-8157-d40715de4224-certificates\") pod \"keda-admission-cf49989db-mmq4f\" (UID: \"cfd1f7cf-1ff6-45f2-8157-d40715de4224\") " pod="openshift-keda/keda-admission-cf49989db-mmq4f" Apr 25 00:00:44.272475 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:00:44.272477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c82js\" (UniqueName: \"kubernetes.io/projected/cfd1f7cf-1ff6-45f2-8157-d40715de4224-kube-api-access-c82js\") pod \"keda-admission-cf49989db-mmq4f\" (UID: \"cfd1f7cf-1ff6-45f2-8157-d40715de4224\") " pod="openshift-keda/keda-admission-cf49989db-mmq4f" Apr 25 00:00:44.272671 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:00:44.272577 2576 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 25 00:00:44.272671 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:00:44.272602 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-mmq4f: secret "keda-admission-webhooks-certs" not found Apr 25 00:00:44.272671 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:00:44.272666 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cfd1f7cf-1ff6-45f2-8157-d40715de4224-certificates podName:cfd1f7cf-1ff6-45f2-8157-d40715de4224 nodeName:}" failed. No retries permitted until 2026-04-25 00:00:44.772647423 +0000 UTC m=+405.053764323 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/cfd1f7cf-1ff6-45f2-8157-d40715de4224-certificates") pod "keda-admission-cf49989db-mmq4f" (UID: "cfd1f7cf-1ff6-45f2-8157-d40715de4224") : secret "keda-admission-webhooks-certs" not found Apr 25 00:00:44.282812 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:00:44.282788 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c82js\" (UniqueName: \"kubernetes.io/projected/cfd1f7cf-1ff6-45f2-8157-d40715de4224-kube-api-access-c82js\") pod \"keda-admission-cf49989db-mmq4f\" (UID: \"cfd1f7cf-1ff6-45f2-8157-d40715de4224\") " pod="openshift-keda/keda-admission-cf49989db-mmq4f" Apr 25 00:00:44.776868 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:00:44.776841 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cfd1f7cf-1ff6-45f2-8157-d40715de4224-certificates\") pod \"keda-admission-cf49989db-mmq4f\" (UID: \"cfd1f7cf-1ff6-45f2-8157-d40715de4224\") " pod="openshift-keda/keda-admission-cf49989db-mmq4f" Apr 25 00:00:44.779479 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:00:44.779446 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cfd1f7cf-1ff6-45f2-8157-d40715de4224-certificates\") pod \"keda-admission-cf49989db-mmq4f\" (UID: \"cfd1f7cf-1ff6-45f2-8157-d40715de4224\") " pod="openshift-keda/keda-admission-cf49989db-mmq4f" Apr 25 00:00:44.915130 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:00:44.915092 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-mmq4f" Apr 25 00:00:45.033418 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:00:45.033391 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-mmq4f"] Apr 25 00:00:45.035814 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:00:45.035785 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfd1f7cf_1ff6_45f2_8157_d40715de4224.slice/crio-020117d34609b7a24df100799eef518dcb01497cdcf2a15ed054bb755b741ffa WatchSource:0}: Error finding container 020117d34609b7a24df100799eef518dcb01497cdcf2a15ed054bb755b741ffa: Status 404 returned error can't find the container with id 020117d34609b7a24df100799eef518dcb01497cdcf2a15ed054bb755b741ffa Apr 25 00:00:45.454275 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:00:45.454193 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-mmq4f" event={"ID":"cfd1f7cf-1ff6-45f2-8157-d40715de4224","Type":"ContainerStarted","Data":"020117d34609b7a24df100799eef518dcb01497cdcf2a15ed054bb755b741ffa"} Apr 25 00:00:49.465050 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:00:49.465024 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-mmq4f" event={"ID":"cfd1f7cf-1ff6-45f2-8157-d40715de4224","Type":"ContainerStarted","Data":"d3089a25b6ca38d829d6327a83dfe3988cb0799121b92aa2f178314a66f2f034"} Apr 25 00:00:49.465453 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:00:49.465154 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-mmq4f" Apr 25 00:00:49.500775 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:00:49.500730 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-mmq4f" podStartSLOduration=2.160459613 podStartE2EDuration="6.500715199s" 
podCreationTimestamp="2026-04-25 00:00:43 +0000 UTC" firstStartedPulling="2026-04-25 00:00:45.037216463 +0000 UTC m=+405.318333361" lastFinishedPulling="2026-04-25 00:00:49.377472047 +0000 UTC m=+409.658588947" observedRunningTime="2026-04-25 00:00:49.499109087 +0000 UTC m=+409.780226005" watchObservedRunningTime="2026-04-25 00:00:49.500715199 +0000 UTC m=+409.781832118" Apr 25 00:01:10.469509 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:01:10.469436 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-mmq4f" Apr 25 00:01:50.468641 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:01:50.468607 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-24pcs"] Apr 25 00:01:50.470781 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:01:50.470765 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-24pcs" Apr 25 00:01:50.473424 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:01:50.473401 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 25 00:01:50.473551 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:01:50.473449 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-dfwlg\"" Apr 25 00:01:50.474206 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:01:50.474184 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 25 00:01:50.474206 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:01:50.474205 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 25 00:01:50.481040 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:01:50.481020 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-24pcs"] Apr 25 00:01:50.512637 ip-10-0-129-109 kubenswrapper[2576]: I0425 
00:01:50.512601 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/81057cd7-9d33-4ce0-ad47-d4eaca1082b7-data\") pod \"seaweedfs-86cc847c5c-24pcs\" (UID: \"81057cd7-9d33-4ce0-ad47-d4eaca1082b7\") " pod="kserve/seaweedfs-86cc847c5c-24pcs" Apr 25 00:01:50.512826 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:01:50.512643 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h45pq\" (UniqueName: \"kubernetes.io/projected/81057cd7-9d33-4ce0-ad47-d4eaca1082b7-kube-api-access-h45pq\") pod \"seaweedfs-86cc847c5c-24pcs\" (UID: \"81057cd7-9d33-4ce0-ad47-d4eaca1082b7\") " pod="kserve/seaweedfs-86cc847c5c-24pcs" Apr 25 00:01:50.613696 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:01:50.613668 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/81057cd7-9d33-4ce0-ad47-d4eaca1082b7-data\") pod \"seaweedfs-86cc847c5c-24pcs\" (UID: \"81057cd7-9d33-4ce0-ad47-d4eaca1082b7\") " pod="kserve/seaweedfs-86cc847c5c-24pcs" Apr 25 00:01:50.613853 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:01:50.613705 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h45pq\" (UniqueName: \"kubernetes.io/projected/81057cd7-9d33-4ce0-ad47-d4eaca1082b7-kube-api-access-h45pq\") pod \"seaweedfs-86cc847c5c-24pcs\" (UID: \"81057cd7-9d33-4ce0-ad47-d4eaca1082b7\") " pod="kserve/seaweedfs-86cc847c5c-24pcs" Apr 25 00:01:50.614214 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:01:50.614191 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/81057cd7-9d33-4ce0-ad47-d4eaca1082b7-data\") pod \"seaweedfs-86cc847c5c-24pcs\" (UID: \"81057cd7-9d33-4ce0-ad47-d4eaca1082b7\") " pod="kserve/seaweedfs-86cc847c5c-24pcs" Apr 25 00:01:50.621792 ip-10-0-129-109 kubenswrapper[2576]: 
I0425 00:01:50.621769 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h45pq\" (UniqueName: \"kubernetes.io/projected/81057cd7-9d33-4ce0-ad47-d4eaca1082b7-kube-api-access-h45pq\") pod \"seaweedfs-86cc847c5c-24pcs\" (UID: \"81057cd7-9d33-4ce0-ad47-d4eaca1082b7\") " pod="kserve/seaweedfs-86cc847c5c-24pcs" Apr 25 00:01:50.779560 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:01:50.779535 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-24pcs" Apr 25 00:01:50.895072 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:01:50.895044 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-24pcs"] Apr 25 00:01:50.897974 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:01:50.897947 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81057cd7_9d33_4ce0_ad47_d4eaca1082b7.slice/crio-468964be9648032eec1279f4a297d42c6b7c610fb20ea361d4bd16e0f446034b WatchSource:0}: Error finding container 468964be9648032eec1279f4a297d42c6b7c610fb20ea361d4bd16e0f446034b: Status 404 returned error can't find the container with id 468964be9648032eec1279f4a297d42c6b7c610fb20ea361d4bd16e0f446034b Apr 25 00:01:51.619964 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:01:51.619905 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-24pcs" event={"ID":"81057cd7-9d33-4ce0-ad47-d4eaca1082b7","Type":"ContainerStarted","Data":"468964be9648032eec1279f4a297d42c6b7c610fb20ea361d4bd16e0f446034b"} Apr 25 00:01:53.626530 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:01:53.626493 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-24pcs" event={"ID":"81057cd7-9d33-4ce0-ad47-d4eaca1082b7","Type":"ContainerStarted","Data":"1592bdd87cc46bacae684ba3f9f84f07bdc83fd0be5264dc2926ffe6162677f8"} Apr 25 00:01:53.627025 ip-10-0-129-109 kubenswrapper[2576]: 
I0425 00:01:53.626635 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-24pcs" Apr 25 00:01:53.643996 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:01:53.643954 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-24pcs" podStartSLOduration=1.324057402 podStartE2EDuration="3.643941994s" podCreationTimestamp="2026-04-25 00:01:50 +0000 UTC" firstStartedPulling="2026-04-25 00:01:50.899546335 +0000 UTC m=+471.180663233" lastFinishedPulling="2026-04-25 00:01:53.219430926 +0000 UTC m=+473.500547825" observedRunningTime="2026-04-25 00:01:53.642513751 +0000 UTC m=+473.923630662" watchObservedRunningTime="2026-04-25 00:01:53.643941994 +0000 UTC m=+473.925058907" Apr 25 00:01:59.632769 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:01:59.631837 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-24pcs" Apr 25 00:03:00.195541 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.195506 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-cf89d"] Apr 25 00:03:00.198494 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.198476 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-cf89d" Apr 25 00:03:00.201087 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.201064 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-h8dkt\"" Apr 25 00:03:00.201182 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.201086 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 25 00:03:00.210732 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.210705 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-cf89d"] Apr 25 00:03:00.211791 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.211769 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-wpr5g"] Apr 25 00:03:00.214657 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.214639 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-wpr5g" Apr 25 00:03:00.217022 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.216999 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 25 00:03:00.217102 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.217005 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-gbnr6\"" Apr 25 00:03:00.226238 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.226216 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-wpr5g"] Apr 25 00:03:00.289315 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.289290 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d6fe5d32-305c-48ba-af24-e0fc40b04868-tls-certs\") pod \"model-serving-api-86f7b4b499-cf89d\" 
(UID: \"d6fe5d32-305c-48ba-af24-e0fc40b04868\") " pod="kserve/model-serving-api-86f7b4b499-cf89d" Apr 25 00:03:00.289436 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.289325 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fd9f0c0-07d6-482b-b3b6-e3c5ee980597-cert\") pod \"odh-model-controller-696fc77849-wpr5g\" (UID: \"5fd9f0c0-07d6-482b-b3b6-e3c5ee980597\") " pod="kserve/odh-model-controller-696fc77849-wpr5g" Apr 25 00:03:00.289436 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.289350 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xllrt\" (UniqueName: \"kubernetes.io/projected/d6fe5d32-305c-48ba-af24-e0fc40b04868-kube-api-access-xllrt\") pod \"model-serving-api-86f7b4b499-cf89d\" (UID: \"d6fe5d32-305c-48ba-af24-e0fc40b04868\") " pod="kserve/model-serving-api-86f7b4b499-cf89d" Apr 25 00:03:00.289436 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.289429 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkh4m\" (UniqueName: \"kubernetes.io/projected/5fd9f0c0-07d6-482b-b3b6-e3c5ee980597-kube-api-access-rkh4m\") pod \"odh-model-controller-696fc77849-wpr5g\" (UID: \"5fd9f0c0-07d6-482b-b3b6-e3c5ee980597\") " pod="kserve/odh-model-controller-696fc77849-wpr5g" Apr 25 00:03:00.390110 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.390080 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d6fe5d32-305c-48ba-af24-e0fc40b04868-tls-certs\") pod \"model-serving-api-86f7b4b499-cf89d\" (UID: \"d6fe5d32-305c-48ba-af24-e0fc40b04868\") " pod="kserve/model-serving-api-86f7b4b499-cf89d" Apr 25 00:03:00.390110 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.390119 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/5fd9f0c0-07d6-482b-b3b6-e3c5ee980597-cert\") pod \"odh-model-controller-696fc77849-wpr5g\" (UID: \"5fd9f0c0-07d6-482b-b3b6-e3c5ee980597\") " pod="kserve/odh-model-controller-696fc77849-wpr5g" Apr 25 00:03:00.390374 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.390143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xllrt\" (UniqueName: \"kubernetes.io/projected/d6fe5d32-305c-48ba-af24-e0fc40b04868-kube-api-access-xllrt\") pod \"model-serving-api-86f7b4b499-cf89d\" (UID: \"d6fe5d32-305c-48ba-af24-e0fc40b04868\") " pod="kserve/model-serving-api-86f7b4b499-cf89d" Apr 25 00:03:00.390374 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.390271 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rkh4m\" (UniqueName: \"kubernetes.io/projected/5fd9f0c0-07d6-482b-b3b6-e3c5ee980597-kube-api-access-rkh4m\") pod \"odh-model-controller-696fc77849-wpr5g\" (UID: \"5fd9f0c0-07d6-482b-b3b6-e3c5ee980597\") " pod="kserve/odh-model-controller-696fc77849-wpr5g" Apr 25 00:03:00.392638 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.392613 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d6fe5d32-305c-48ba-af24-e0fc40b04868-tls-certs\") pod \"model-serving-api-86f7b4b499-cf89d\" (UID: \"d6fe5d32-305c-48ba-af24-e0fc40b04868\") " pod="kserve/model-serving-api-86f7b4b499-cf89d" Apr 25 00:03:00.392733 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.392666 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5fd9f0c0-07d6-482b-b3b6-e3c5ee980597-cert\") pod \"odh-model-controller-696fc77849-wpr5g\" (UID: \"5fd9f0c0-07d6-482b-b3b6-e3c5ee980597\") " pod="kserve/odh-model-controller-696fc77849-wpr5g" Apr 25 00:03:00.401039 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.401017 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rkh4m\" (UniqueName: \"kubernetes.io/projected/5fd9f0c0-07d6-482b-b3b6-e3c5ee980597-kube-api-access-rkh4m\") pod \"odh-model-controller-696fc77849-wpr5g\" (UID: \"5fd9f0c0-07d6-482b-b3b6-e3c5ee980597\") " pod="kserve/odh-model-controller-696fc77849-wpr5g" Apr 25 00:03:00.401123 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.401076 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xllrt\" (UniqueName: \"kubernetes.io/projected/d6fe5d32-305c-48ba-af24-e0fc40b04868-kube-api-access-xllrt\") pod \"model-serving-api-86f7b4b499-cf89d\" (UID: \"d6fe5d32-305c-48ba-af24-e0fc40b04868\") " pod="kserve/model-serving-api-86f7b4b499-cf89d" Apr 25 00:03:00.513099 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.513010 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-cf89d" Apr 25 00:03:00.523905 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.523882 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-wpr5g" Apr 25 00:03:00.639685 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.639652 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-cf89d"] Apr 25 00:03:00.644173 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:03:00.644148 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6fe5d32_305c_48ba_af24_e0fc40b04868.slice/crio-6bb184489bd14752f61e191f98793ea33f0d33ef9d4f8d4bfd725d472e88bb49 WatchSource:0}: Error finding container 6bb184489bd14752f61e191f98793ea33f0d33ef9d4f8d4bfd725d472e88bb49: Status 404 returned error can't find the container with id 6bb184489bd14752f61e191f98793ea33f0d33ef9d4f8d4bfd725d472e88bb49 Apr 25 00:03:00.655310 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.655270 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-wpr5g"] Apr 25 00:03:00.657484 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:03:00.657459 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fd9f0c0_07d6_482b_b3b6_e3c5ee980597.slice/crio-a836500fb486223af6e4c884e9973f617fe7386d4dbb49fb23a7abee6cb7b670 WatchSource:0}: Error finding container a836500fb486223af6e4c884e9973f617fe7386d4dbb49fb23a7abee6cb7b670: Status 404 returned error can't find the container with id a836500fb486223af6e4c884e9973f617fe7386d4dbb49fb23a7abee6cb7b670 Apr 25 00:03:00.793931 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.793877 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-cf89d" event={"ID":"d6fe5d32-305c-48ba-af24-e0fc40b04868","Type":"ContainerStarted","Data":"6bb184489bd14752f61e191f98793ea33f0d33ef9d4f8d4bfd725d472e88bb49"} Apr 25 00:03:00.794780 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:00.794757 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-wpr5g" event={"ID":"5fd9f0c0-07d6-482b-b3b6-e3c5ee980597","Type":"ContainerStarted","Data":"a836500fb486223af6e4c884e9973f617fe7386d4dbb49fb23a7abee6cb7b670"} Apr 25 00:03:04.807997 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:04.807955 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-wpr5g" event={"ID":"5fd9f0c0-07d6-482b-b3b6-e3c5ee980597","Type":"ContainerStarted","Data":"b3a44fff30394be45b6d2ff18f03c8f02c6d13d136b2f8a283ae3b955b19254e"} Apr 25 00:03:04.808438 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:04.808060 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-wpr5g" Apr 25 00:03:04.809333 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:04.809298 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-cf89d" event={"ID":"d6fe5d32-305c-48ba-af24-e0fc40b04868","Type":"ContainerStarted","Data":"2ac8ecf00bd963f2db178c5b13a3401ee652ec5a664272a1135626d5804c6caa"} Apr 25 00:03:04.809445 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:04.809422 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-cf89d" Apr 25 00:03:04.851675 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:04.851632 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-wpr5g" podStartSLOduration=1.170129846 podStartE2EDuration="4.85162136s" podCreationTimestamp="2026-04-25 00:03:00 +0000 UTC" firstStartedPulling="2026-04-25 00:03:00.658870595 +0000 UTC m=+540.939987493" lastFinishedPulling="2026-04-25 00:03:04.340362093 +0000 UTC m=+544.621479007" observedRunningTime="2026-04-25 00:03:04.847768846 +0000 UTC m=+545.128885765" watchObservedRunningTime="2026-04-25 00:03:04.85162136 +0000 UTC m=+545.132738295" Apr 
25 00:03:15.814718 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:15.814686 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-wpr5g" Apr 25 00:03:15.816773 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:15.816750 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-cf89d" Apr 25 00:03:15.831598 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:15.831554 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-cf89d" podStartSLOduration=12.145702312 podStartE2EDuration="15.831542979s" podCreationTimestamp="2026-04-25 00:03:00 +0000 UTC" firstStartedPulling="2026-04-25 00:03:00.64590522 +0000 UTC m=+540.927022117" lastFinishedPulling="2026-04-25 00:03:04.331745883 +0000 UTC m=+544.612862784" observedRunningTime="2026-04-25 00:03:04.883359613 +0000 UTC m=+545.164476535" watchObservedRunningTime="2026-04-25 00:03:15.831542979 +0000 UTC m=+556.112659900" Apr 25 00:03:16.609345 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:16.609311 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-44llv"] Apr 25 00:03:16.612235 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:16.612220 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-44llv" Apr 25 00:03:16.618449 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:16.618421 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-44llv"] Apr 25 00:03:16.721256 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:16.721224 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqwrz\" (UniqueName: \"kubernetes.io/projected/5a576370-66a9-4fd2-b563-7c3efaa5712d-kube-api-access-mqwrz\") pod \"s3-init-44llv\" (UID: \"5a576370-66a9-4fd2-b563-7c3efaa5712d\") " pod="kserve/s3-init-44llv" Apr 25 00:03:16.822482 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:16.822448 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqwrz\" (UniqueName: \"kubernetes.io/projected/5a576370-66a9-4fd2-b563-7c3efaa5712d-kube-api-access-mqwrz\") pod \"s3-init-44llv\" (UID: \"5a576370-66a9-4fd2-b563-7c3efaa5712d\") " pod="kserve/s3-init-44llv" Apr 25 00:03:16.831083 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:16.831062 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqwrz\" (UniqueName: \"kubernetes.io/projected/5a576370-66a9-4fd2-b563-7c3efaa5712d-kube-api-access-mqwrz\") pod \"s3-init-44llv\" (UID: \"5a576370-66a9-4fd2-b563-7c3efaa5712d\") " pod="kserve/s3-init-44llv" Apr 25 00:03:16.921358 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:16.921298 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-44llv" Apr 25 00:03:17.036371 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:17.036340 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-44llv"] Apr 25 00:03:17.040218 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:03:17.040184 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a576370_66a9_4fd2_b563_7c3efaa5712d.slice/crio-e55c08fcd61ce860b74b5ba4cc9e758bb6fa339d96d13058e3fe602dc625fc89 WatchSource:0}: Error finding container e55c08fcd61ce860b74b5ba4cc9e758bb6fa339d96d13058e3fe602dc625fc89: Status 404 returned error can't find the container with id e55c08fcd61ce860b74b5ba4cc9e758bb6fa339d96d13058e3fe602dc625fc89 Apr 25 00:03:17.852986 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:17.852943 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-44llv" event={"ID":"5a576370-66a9-4fd2-b563-7c3efaa5712d","Type":"ContainerStarted","Data":"e55c08fcd61ce860b74b5ba4cc9e758bb6fa339d96d13058e3fe602dc625fc89"} Apr 25 00:03:21.865784 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:21.865745 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-44llv" event={"ID":"5a576370-66a9-4fd2-b563-7c3efaa5712d","Type":"ContainerStarted","Data":"d884126b228418df4c682de5dd0576f9c6977b767012c166690e2b503fcfd7a2"} Apr 25 00:03:21.880450 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:21.880406 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-44llv" podStartSLOduration=1.565959608 podStartE2EDuration="5.880393377s" podCreationTimestamp="2026-04-25 00:03:16 +0000 UTC" firstStartedPulling="2026-04-25 00:03:17.042080425 +0000 UTC m=+557.323197337" lastFinishedPulling="2026-04-25 00:03:21.356514205 +0000 UTC m=+561.637631106" observedRunningTime="2026-04-25 00:03:21.879499096 +0000 UTC m=+562.160616016" watchObservedRunningTime="2026-04-25 
00:03:21.880393377 +0000 UTC m=+562.161510297" Apr 25 00:03:24.875068 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:24.875034 2576 generic.go:358] "Generic (PLEG): container finished" podID="5a576370-66a9-4fd2-b563-7c3efaa5712d" containerID="d884126b228418df4c682de5dd0576f9c6977b767012c166690e2b503fcfd7a2" exitCode=0 Apr 25 00:03:24.875431 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:24.875109 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-44llv" event={"ID":"5a576370-66a9-4fd2-b563-7c3efaa5712d","Type":"ContainerDied","Data":"d884126b228418df4c682de5dd0576f9c6977b767012c166690e2b503fcfd7a2"} Apr 25 00:03:25.999044 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:25.999024 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-44llv" Apr 25 00:03:26.091073 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:26.091035 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqwrz\" (UniqueName: \"kubernetes.io/projected/5a576370-66a9-4fd2-b563-7c3efaa5712d-kube-api-access-mqwrz\") pod \"5a576370-66a9-4fd2-b563-7c3efaa5712d\" (UID: \"5a576370-66a9-4fd2-b563-7c3efaa5712d\") " Apr 25 00:03:26.093219 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:26.093190 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a576370-66a9-4fd2-b563-7c3efaa5712d-kube-api-access-mqwrz" (OuterVolumeSpecName: "kube-api-access-mqwrz") pod "5a576370-66a9-4fd2-b563-7c3efaa5712d" (UID: "5a576370-66a9-4fd2-b563-7c3efaa5712d"). InnerVolumeSpecName "kube-api-access-mqwrz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:03:26.192100 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:26.192021 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mqwrz\" (UniqueName: \"kubernetes.io/projected/5a576370-66a9-4fd2-b563-7c3efaa5712d-kube-api-access-mqwrz\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:03:26.881346 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:26.881322 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-44llv" Apr 25 00:03:26.881517 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:26.881316 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-44llv" event={"ID":"5a576370-66a9-4fd2-b563-7c3efaa5712d","Type":"ContainerDied","Data":"e55c08fcd61ce860b74b5ba4cc9e758bb6fa339d96d13058e3fe602dc625fc89"} Apr 25 00:03:26.881517 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:26.881424 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e55c08fcd61ce860b74b5ba4cc9e758bb6fa339d96d13058e3fe602dc625fc89" Apr 25 00:03:27.556050 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:27.556020 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-r9hnr"] Apr 25 00:03:27.556409 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:27.556303 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a576370-66a9-4fd2-b563-7c3efaa5712d" containerName="s3-init" Apr 25 00:03:27.556409 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:27.556313 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a576370-66a9-4fd2-b563-7c3efaa5712d" containerName="s3-init" Apr 25 00:03:27.556409 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:27.556353 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a576370-66a9-4fd2-b563-7c3efaa5712d" containerName="s3-init" Apr 25 00:03:27.559290 ip-10-0-129-109 
kubenswrapper[2576]: I0425 00:03:27.559269 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-r9hnr" Apr 25 00:03:27.561620 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:27.561599 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 25 00:03:27.565884 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:27.565662 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-r9hnr"] Apr 25 00:03:27.603197 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:27.603177 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-r9hnr\" (UID: \"83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-r9hnr" Apr 25 00:03:27.603298 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:27.603219 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgxj2\" (UniqueName: \"kubernetes.io/projected/83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32-kube-api-access-zgxj2\") pod \"seaweedfs-tls-custom-ddd4dbfd-r9hnr\" (UID: \"83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-r9hnr" Apr 25 00:03:27.703781 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:27.703753 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zgxj2\" (UniqueName: \"kubernetes.io/projected/83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32-kube-api-access-zgxj2\") pod \"seaweedfs-tls-custom-ddd4dbfd-r9hnr\" (UID: \"83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-r9hnr" Apr 25 00:03:27.704147 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:27.703819 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-r9hnr\" (UID: \"83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-r9hnr" Apr 25 00:03:27.704233 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:27.704218 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-r9hnr\" (UID: \"83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-r9hnr" Apr 25 00:03:27.711702 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:27.711682 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgxj2\" (UniqueName: \"kubernetes.io/projected/83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32-kube-api-access-zgxj2\") pod \"seaweedfs-tls-custom-ddd4dbfd-r9hnr\" (UID: \"83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-r9hnr" Apr 25 00:03:27.868865 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:27.868793 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-r9hnr" Apr 25 00:03:27.983724 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:27.983689 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-r9hnr"] Apr 25 00:03:27.986628 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:03:27.986601 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83ca8e1a_2e6b_4db9_8efc_6d7cf3560a32.slice/crio-c8b2a856b9bb647df4ab3eb1c9605d48a67ae5bb32a3f12cf6c5f309d9be46e9 WatchSource:0}: Error finding container c8b2a856b9bb647df4ab3eb1c9605d48a67ae5bb32a3f12cf6c5f309d9be46e9: Status 404 returned error can't find the container with id c8b2a856b9bb647df4ab3eb1c9605d48a67ae5bb32a3f12cf6c5f309d9be46e9 Apr 25 00:03:28.887332 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:28.887290 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-r9hnr" event={"ID":"83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32","Type":"ContainerStarted","Data":"c1d957c11d39b6201ebdf4321a1bb07a9e5ec30b01187e800de61be5a762632c"} Apr 25 00:03:28.887332 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:28.887332 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-r9hnr" event={"ID":"83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32","Type":"ContainerStarted","Data":"c8b2a856b9bb647df4ab3eb1c9605d48a67ae5bb32a3f12cf6c5f309d9be46e9"} Apr 25 00:03:28.901555 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:28.901480 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-r9hnr" podStartSLOduration=1.641228945 podStartE2EDuration="1.901466699s" podCreationTimestamp="2026-04-25 00:03:27 +0000 UTC" firstStartedPulling="2026-04-25 00:03:27.988160381 +0000 UTC m=+568.269277279" lastFinishedPulling="2026-04-25 00:03:28.248398135 +0000 UTC m=+568.529515033" 
observedRunningTime="2026-04-25 00:03:28.900349418 +0000 UTC m=+569.181466365" watchObservedRunningTime="2026-04-25 00:03:28.901466699 +0000 UTC m=+569.182583618" Apr 25 00:03:30.345169 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:30.345138 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-r9hnr"] Apr 25 00:03:30.892728 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:30.892692 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-r9hnr" podUID="83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32" containerName="seaweedfs-tls-custom" containerID="cri-o://c1d957c11d39b6201ebdf4321a1bb07a9e5ec30b01187e800de61be5a762632c" gracePeriod=30 Apr 25 00:03:58.927987 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:58.927959 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-r9hnr" Apr 25 00:03:58.969343 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:58.969302 2576 generic.go:358] "Generic (PLEG): container finished" podID="83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32" containerID="c1d957c11d39b6201ebdf4321a1bb07a9e5ec30b01187e800de61be5a762632c" exitCode=0 Apr 25 00:03:58.969501 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:58.969367 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-r9hnr" Apr 25 00:03:58.969501 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:58.969376 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-r9hnr" event={"ID":"83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32","Type":"ContainerDied","Data":"c1d957c11d39b6201ebdf4321a1bb07a9e5ec30b01187e800de61be5a762632c"} Apr 25 00:03:58.969501 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:58.969431 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-r9hnr" event={"ID":"83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32","Type":"ContainerDied","Data":"c8b2a856b9bb647df4ab3eb1c9605d48a67ae5bb32a3f12cf6c5f309d9be46e9"} Apr 25 00:03:58.969501 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:58.969451 2576 scope.go:117] "RemoveContainer" containerID="c1d957c11d39b6201ebdf4321a1bb07a9e5ec30b01187e800de61be5a762632c" Apr 25 00:03:58.980005 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:58.979990 2576 scope.go:117] "RemoveContainer" containerID="c1d957c11d39b6201ebdf4321a1bb07a9e5ec30b01187e800de61be5a762632c" Apr 25 00:03:58.980271 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:03:58.980255 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1d957c11d39b6201ebdf4321a1bb07a9e5ec30b01187e800de61be5a762632c\": container with ID starting with c1d957c11d39b6201ebdf4321a1bb07a9e5ec30b01187e800de61be5a762632c not found: ID does not exist" containerID="c1d957c11d39b6201ebdf4321a1bb07a9e5ec30b01187e800de61be5a762632c" Apr 25 00:03:58.980328 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:58.980279 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1d957c11d39b6201ebdf4321a1bb07a9e5ec30b01187e800de61be5a762632c"} err="failed to get container status \"c1d957c11d39b6201ebdf4321a1bb07a9e5ec30b01187e800de61be5a762632c\": rpc error: code = 
NotFound desc = could not find container \"c1d957c11d39b6201ebdf4321a1bb07a9e5ec30b01187e800de61be5a762632c\": container with ID starting with c1d957c11d39b6201ebdf4321a1bb07a9e5ec30b01187e800de61be5a762632c not found: ID does not exist" Apr 25 00:03:59.016573 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:59.016556 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32-data\") pod \"83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32\" (UID: \"83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32\") " Apr 25 00:03:59.016695 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:59.016613 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgxj2\" (UniqueName: \"kubernetes.io/projected/83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32-kube-api-access-zgxj2\") pod \"83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32\" (UID: \"83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32\") " Apr 25 00:03:59.017811 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:59.017788 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32-data" (OuterVolumeSpecName: "data") pod "83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32" (UID: "83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:03:59.018632 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:59.018612 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32-kube-api-access-zgxj2" (OuterVolumeSpecName: "kube-api-access-zgxj2") pod "83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32" (UID: "83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32"). InnerVolumeSpecName "kube-api-access-zgxj2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:03:59.117565 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:59.117531 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zgxj2\" (UniqueName: \"kubernetes.io/projected/83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32-kube-api-access-zgxj2\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:03:59.117565 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:59.117561 2576 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32-data\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:03:59.293554 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:59.290131 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-r9hnr"] Apr 25 00:03:59.295382 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:03:59.295355 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-r9hnr"] Apr 25 00:04:00.209828 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:00.209803 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 25 00:04:00.211343 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:00.211318 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 25 00:04:00.314890 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:00.314862 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32" path="/var/lib/kubelet/pods/83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32/volumes" Apr 25 00:04:03.057803 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:03.057770 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-2g2bm"] Apr 25 
00:04:03.058190 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:03.058062 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32" containerName="seaweedfs-tls-custom" Apr 25 00:04:03.058190 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:03.058075 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32" containerName="seaweedfs-tls-custom" Apr 25 00:04:03.058190 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:03.058138 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="83ca8e1a-2e6b-4db9-8efc-6d7cf3560a32" containerName="seaweedfs-tls-custom" Apr 25 00:04:03.063061 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:03.063041 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-2g2bm" Apr 25 00:04:03.065496 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:03.065482 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 25 00:04:03.069839 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:03.069818 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-2g2bm"] Apr 25 00:04:03.143158 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:03.143128 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2jl2\" (UniqueName: \"kubernetes.io/projected/5337f27c-0430-45dc-99d5-dc9a28b10f64-kube-api-access-n2jl2\") pod \"s3-tls-init-custom-2g2bm\" (UID: \"5337f27c-0430-45dc-99d5-dc9a28b10f64\") " pod="kserve/s3-tls-init-custom-2g2bm" Apr 25 00:04:03.243652 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:03.243615 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2jl2\" (UniqueName: \"kubernetes.io/projected/5337f27c-0430-45dc-99d5-dc9a28b10f64-kube-api-access-n2jl2\") pod 
\"s3-tls-init-custom-2g2bm\" (UID: \"5337f27c-0430-45dc-99d5-dc9a28b10f64\") " pod="kserve/s3-tls-init-custom-2g2bm" Apr 25 00:04:03.251910 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:03.251883 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2jl2\" (UniqueName: \"kubernetes.io/projected/5337f27c-0430-45dc-99d5-dc9a28b10f64-kube-api-access-n2jl2\") pod \"s3-tls-init-custom-2g2bm\" (UID: \"5337f27c-0430-45dc-99d5-dc9a28b10f64\") " pod="kserve/s3-tls-init-custom-2g2bm" Apr 25 00:04:03.371497 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:03.371440 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-2g2bm" Apr 25 00:04:03.486750 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:03.486717 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-2g2bm"] Apr 25 00:04:03.489442 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:04:03.489414 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5337f27c_0430_45dc_99d5_dc9a28b10f64.slice/crio-3d31811792b2b4682dcb0b5a0d186cd754eac9bd83bb0a9ea69ea513e136bcf9 WatchSource:0}: Error finding container 3d31811792b2b4682dcb0b5a0d186cd754eac9bd83bb0a9ea69ea513e136bcf9: Status 404 returned error can't find the container with id 3d31811792b2b4682dcb0b5a0d186cd754eac9bd83bb0a9ea69ea513e136bcf9 Apr 25 00:04:03.984837 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:03.984801 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-2g2bm" event={"ID":"5337f27c-0430-45dc-99d5-dc9a28b10f64","Type":"ContainerStarted","Data":"0148834e726c300947a4b05491418e105d8999b26756676af0ab51a39ad1a724"} Apr 25 00:04:03.984837 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:03.984836 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-2g2bm" 
event={"ID":"5337f27c-0430-45dc-99d5-dc9a28b10f64","Type":"ContainerStarted","Data":"3d31811792b2b4682dcb0b5a0d186cd754eac9bd83bb0a9ea69ea513e136bcf9"} Apr 25 00:04:03.998755 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:03.998521 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-2g2bm" podStartSLOduration=0.998503102 podStartE2EDuration="998.503102ms" podCreationTimestamp="2026-04-25 00:04:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:04:03.99791023 +0000 UTC m=+604.279027152" watchObservedRunningTime="2026-04-25 00:04:03.998503102 +0000 UTC m=+604.279620024" Apr 25 00:04:10.001890 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:10.001857 2576 generic.go:358] "Generic (PLEG): container finished" podID="5337f27c-0430-45dc-99d5-dc9a28b10f64" containerID="0148834e726c300947a4b05491418e105d8999b26756676af0ab51a39ad1a724" exitCode=0 Apr 25 00:04:10.002302 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:10.001904 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-2g2bm" event={"ID":"5337f27c-0430-45dc-99d5-dc9a28b10f64","Type":"ContainerDied","Data":"0148834e726c300947a4b05491418e105d8999b26756676af0ab51a39ad1a724"} Apr 25 00:04:11.126676 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:11.126655 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-2g2bm" Apr 25 00:04:11.199607 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:11.199575 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2jl2\" (UniqueName: \"kubernetes.io/projected/5337f27c-0430-45dc-99d5-dc9a28b10f64-kube-api-access-n2jl2\") pod \"5337f27c-0430-45dc-99d5-dc9a28b10f64\" (UID: \"5337f27c-0430-45dc-99d5-dc9a28b10f64\") " Apr 25 00:04:11.201663 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:11.201635 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5337f27c-0430-45dc-99d5-dc9a28b10f64-kube-api-access-n2jl2" (OuterVolumeSpecName: "kube-api-access-n2jl2") pod "5337f27c-0430-45dc-99d5-dc9a28b10f64" (UID: "5337f27c-0430-45dc-99d5-dc9a28b10f64"). InnerVolumeSpecName "kube-api-access-n2jl2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:04:11.300648 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:11.300627 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n2jl2\" (UniqueName: \"kubernetes.io/projected/5337f27c-0430-45dc-99d5-dc9a28b10f64-kube-api-access-n2jl2\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:04:12.007839 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:12.007806 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-2g2bm" Apr 25 00:04:12.007839 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:12.007829 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-2g2bm" event={"ID":"5337f27c-0430-45dc-99d5-dc9a28b10f64","Type":"ContainerDied","Data":"3d31811792b2b4682dcb0b5a0d186cd754eac9bd83bb0a9ea69ea513e136bcf9"} Apr 25 00:04:12.007839 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:12.007864 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d31811792b2b4682dcb0b5a0d186cd754eac9bd83bb0a9ea69ea513e136bcf9" Apr 25 00:04:12.577117 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:12.577082 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-6jtgc"] Apr 25 00:04:12.577477 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:12.577331 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5337f27c-0430-45dc-99d5-dc9a28b10f64" containerName="s3-tls-init-custom" Apr 25 00:04:12.577477 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:12.577341 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5337f27c-0430-45dc-99d5-dc9a28b10f64" containerName="s3-tls-init-custom" Apr 25 00:04:12.577477 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:12.577393 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5337f27c-0430-45dc-99d5-dc9a28b10f64" containerName="s3-tls-init-custom" Apr 25 00:04:12.579882 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:12.579865 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-6jtgc" Apr 25 00:04:12.582260 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:12.582229 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 25 00:04:12.582391 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:12.582376 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 25 00:04:12.587354 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:12.587330 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-6jtgc"] Apr 25 00:04:12.711190 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:12.711163 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a0f88363-12b0-46f6-96d5-796d9680366f-data\") pod \"seaweedfs-tls-serving-7fd5766db9-6jtgc\" (UID: \"a0f88363-12b0-46f6-96d5-796d9680366f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-6jtgc" Apr 25 00:04:12.711345 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:12.711197 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/a0f88363-12b0-46f6-96d5-796d9680366f-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-6jtgc\" (UID: \"a0f88363-12b0-46f6-96d5-796d9680366f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-6jtgc" Apr 25 00:04:12.711345 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:12.711259 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8ctw\" (UniqueName: \"kubernetes.io/projected/a0f88363-12b0-46f6-96d5-796d9680366f-kube-api-access-k8ctw\") pod \"seaweedfs-tls-serving-7fd5766db9-6jtgc\" (UID: \"a0f88363-12b0-46f6-96d5-796d9680366f\") " 
pod="kserve/seaweedfs-tls-serving-7fd5766db9-6jtgc" Apr 25 00:04:12.812503 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:12.812468 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8ctw\" (UniqueName: \"kubernetes.io/projected/a0f88363-12b0-46f6-96d5-796d9680366f-kube-api-access-k8ctw\") pod \"seaweedfs-tls-serving-7fd5766db9-6jtgc\" (UID: \"a0f88363-12b0-46f6-96d5-796d9680366f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-6jtgc" Apr 25 00:04:12.812637 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:12.812526 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a0f88363-12b0-46f6-96d5-796d9680366f-data\") pod \"seaweedfs-tls-serving-7fd5766db9-6jtgc\" (UID: \"a0f88363-12b0-46f6-96d5-796d9680366f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-6jtgc" Apr 25 00:04:12.812637 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:12.812545 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/a0f88363-12b0-46f6-96d5-796d9680366f-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-6jtgc\" (UID: \"a0f88363-12b0-46f6-96d5-796d9680366f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-6jtgc" Apr 25 00:04:12.813030 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:12.813012 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a0f88363-12b0-46f6-96d5-796d9680366f-data\") pod \"seaweedfs-tls-serving-7fd5766db9-6jtgc\" (UID: \"a0f88363-12b0-46f6-96d5-796d9680366f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-6jtgc" Apr 25 00:04:12.814962 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:12.814945 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: 
\"kubernetes.io/projected/a0f88363-12b0-46f6-96d5-796d9680366f-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-6jtgc\" (UID: \"a0f88363-12b0-46f6-96d5-796d9680366f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-6jtgc" Apr 25 00:04:12.821524 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:12.821499 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8ctw\" (UniqueName: \"kubernetes.io/projected/a0f88363-12b0-46f6-96d5-796d9680366f-kube-api-access-k8ctw\") pod \"seaweedfs-tls-serving-7fd5766db9-6jtgc\" (UID: \"a0f88363-12b0-46f6-96d5-796d9680366f\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-6jtgc" Apr 25 00:04:12.889236 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:12.889169 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-6jtgc" Apr 25 00:04:13.028555 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:13.028410 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-6jtgc"] Apr 25 00:04:13.031135 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:04:13.031105 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0f88363_12b0_46f6_96d5_796d9680366f.slice/crio-0e06d58658f0d5b28af67c11c7b2eec0bea13c1f609a15aee77db7accad524c7 WatchSource:0}: Error finding container 0e06d58658f0d5b28af67c11c7b2eec0bea13c1f609a15aee77db7accad524c7: Status 404 returned error can't find the container with id 0e06d58658f0d5b28af67c11c7b2eec0bea13c1f609a15aee77db7accad524c7 Apr 25 00:04:14.014980 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:14.014942 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-6jtgc" event={"ID":"a0f88363-12b0-46f6-96d5-796d9680366f","Type":"ContainerStarted","Data":"39f56bfffa5227b7ffa858295503e4d4e18983489bb28d04f5d5cc3f92f3a9d0"} Apr 25 00:04:14.014980 
ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:14.014982 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-6jtgc" event={"ID":"a0f88363-12b0-46f6-96d5-796d9680366f","Type":"ContainerStarted","Data":"0e06d58658f0d5b28af67c11c7b2eec0bea13c1f609a15aee77db7accad524c7"} Apr 25 00:04:14.029822 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:14.029775 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-6jtgc" podStartSLOduration=1.765008902 podStartE2EDuration="2.02976127s" podCreationTimestamp="2026-04-25 00:04:12 +0000 UTC" firstStartedPulling="2026-04-25 00:04:13.032298445 +0000 UTC m=+613.313415343" lastFinishedPulling="2026-04-25 00:04:13.297050813 +0000 UTC m=+613.578167711" observedRunningTime="2026-04-25 00:04:14.028246021 +0000 UTC m=+614.309362952" watchObservedRunningTime="2026-04-25 00:04:14.02976127 +0000 UTC m=+614.310878190" Apr 25 00:04:14.534005 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:14.533967 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-9rbcb"] Apr 25 00:04:14.538131 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:14.538106 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-9rbcb"] Apr 25 00:04:14.538260 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:14.538225 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-9rbcb" Apr 25 00:04:14.627765 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:14.627731 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxlbt\" (UniqueName: \"kubernetes.io/projected/9b14cfaf-b24c-4ab0-b9f9-2f378c08f836-kube-api-access-mxlbt\") pod \"s3-tls-init-serving-9rbcb\" (UID: \"9b14cfaf-b24c-4ab0-b9f9-2f378c08f836\") " pod="kserve/s3-tls-init-serving-9rbcb" Apr 25 00:04:14.728806 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:14.728766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxlbt\" (UniqueName: \"kubernetes.io/projected/9b14cfaf-b24c-4ab0-b9f9-2f378c08f836-kube-api-access-mxlbt\") pod \"s3-tls-init-serving-9rbcb\" (UID: \"9b14cfaf-b24c-4ab0-b9f9-2f378c08f836\") " pod="kserve/s3-tls-init-serving-9rbcb" Apr 25 00:04:14.737155 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:14.737127 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxlbt\" (UniqueName: \"kubernetes.io/projected/9b14cfaf-b24c-4ab0-b9f9-2f378c08f836-kube-api-access-mxlbt\") pod \"s3-tls-init-serving-9rbcb\" (UID: \"9b14cfaf-b24c-4ab0-b9f9-2f378c08f836\") " pod="kserve/s3-tls-init-serving-9rbcb" Apr 25 00:04:14.847807 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:14.847745 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-9rbcb" Apr 25 00:04:14.960136 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:14.960104 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-9rbcb"] Apr 25 00:04:14.962772 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:04:14.962745 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b14cfaf_b24c_4ab0_b9f9_2f378c08f836.slice/crio-b2e2bf22c87151ab1351d871f7cc0d496425ff0b98cc21c91e11d4333fcaaea2 WatchSource:0}: Error finding container b2e2bf22c87151ab1351d871f7cc0d496425ff0b98cc21c91e11d4333fcaaea2: Status 404 returned error can't find the container with id b2e2bf22c87151ab1351d871f7cc0d496425ff0b98cc21c91e11d4333fcaaea2 Apr 25 00:04:15.019998 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:15.019969 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-9rbcb" event={"ID":"9b14cfaf-b24c-4ab0-b9f9-2f378c08f836","Type":"ContainerStarted","Data":"2270542893d35bf1814664bb8db1f74124c7fd823117b966cd94ed193ae9369d"} Apr 25 00:04:15.020358 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:15.020008 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-9rbcb" event={"ID":"9b14cfaf-b24c-4ab0-b9f9-2f378c08f836","Type":"ContainerStarted","Data":"b2e2bf22c87151ab1351d871f7cc0d496425ff0b98cc21c91e11d4333fcaaea2"} Apr 25 00:04:20.034675 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:20.034641 2576 generic.go:358] "Generic (PLEG): container finished" podID="9b14cfaf-b24c-4ab0-b9f9-2f378c08f836" containerID="2270542893d35bf1814664bb8db1f74124c7fd823117b966cd94ed193ae9369d" exitCode=0 Apr 25 00:04:20.035059 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:20.034690 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-9rbcb" 
event={"ID":"9b14cfaf-b24c-4ab0-b9f9-2f378c08f836","Type":"ContainerDied","Data":"2270542893d35bf1814664bb8db1f74124c7fd823117b966cd94ed193ae9369d"} Apr 25 00:04:21.171452 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:21.171419 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-9rbcb" Apr 25 00:04:21.282020 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:21.281993 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxlbt\" (UniqueName: \"kubernetes.io/projected/9b14cfaf-b24c-4ab0-b9f9-2f378c08f836-kube-api-access-mxlbt\") pod \"9b14cfaf-b24c-4ab0-b9f9-2f378c08f836\" (UID: \"9b14cfaf-b24c-4ab0-b9f9-2f378c08f836\") " Apr 25 00:04:21.284102 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:21.284078 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b14cfaf-b24c-4ab0-b9f9-2f378c08f836-kube-api-access-mxlbt" (OuterVolumeSpecName: "kube-api-access-mxlbt") pod "9b14cfaf-b24c-4ab0-b9f9-2f378c08f836" (UID: "9b14cfaf-b24c-4ab0-b9f9-2f378c08f836"). InnerVolumeSpecName "kube-api-access-mxlbt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:04:21.382716 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:21.382690 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mxlbt\" (UniqueName: \"kubernetes.io/projected/9b14cfaf-b24c-4ab0-b9f9-2f378c08f836-kube-api-access-mxlbt\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:04:22.041140 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:22.041112 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-9rbcb" Apr 25 00:04:22.041284 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:22.041113 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-9rbcb" event={"ID":"9b14cfaf-b24c-4ab0-b9f9-2f378c08f836","Type":"ContainerDied","Data":"b2e2bf22c87151ab1351d871f7cc0d496425ff0b98cc21c91e11d4333fcaaea2"} Apr 25 00:04:22.041284 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:22.041221 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2e2bf22c87151ab1351d871f7cc0d496425ff0b98cc21c91e11d4333fcaaea2" Apr 25 00:04:31.314070 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.314042 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6"] Apr 25 00:04:31.314428 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.314326 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b14cfaf-b24c-4ab0-b9f9-2f378c08f836" containerName="s3-tls-init-serving" Apr 25 00:04:31.314428 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.314337 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b14cfaf-b24c-4ab0-b9f9-2f378c08f836" containerName="s3-tls-init-serving" Apr 25 00:04:31.314428 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.314379 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b14cfaf-b24c-4ab0-b9f9-2f378c08f836" containerName="s3-tls-init-serving" Apr 25 00:04:31.317511 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.317493 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" Apr 25 00:04:31.319839 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.319813 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 25 00:04:31.319992 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.319910 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-predictor-serving-cert\"" Apr 25 00:04:31.319992 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.319943 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-kz9zk\"" Apr 25 00:04:31.319992 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.319947 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\"" Apr 25 00:04:31.320820 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.320802 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 25 00:04:31.326814 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.326789 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6"] Apr 25 00:04:31.355475 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.355448 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/61e5ef93-f2a2-4790-959d-834c553d9929-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-77df99677f-dd6m6\" (UID: \"61e5ef93-f2a2-4790-959d-834c553d9929\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" Apr 25 00:04:31.355551 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.355497 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/61e5ef93-f2a2-4790-959d-834c553d9929-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-77df99677f-dd6m6\" (UID: \"61e5ef93-f2a2-4790-959d-834c553d9929\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" Apr 25 00:04:31.355551 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.355539 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61e5ef93-f2a2-4790-959d-834c553d9929-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-77df99677f-dd6m6\" (UID: \"61e5ef93-f2a2-4790-959d-834c553d9929\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" Apr 25 00:04:31.355628 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.355576 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvvv5\" (UniqueName: \"kubernetes.io/projected/61e5ef93-f2a2-4790-959d-834c553d9929-kube-api-access-hvvv5\") pod \"isvc-sklearn-batcher-predictor-77df99677f-dd6m6\" (UID: \"61e5ef93-f2a2-4790-959d-834c553d9929\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" Apr 25 00:04:31.456818 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.456787 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hvvv5\" (UniqueName: \"kubernetes.io/projected/61e5ef93-f2a2-4790-959d-834c553d9929-kube-api-access-hvvv5\") pod \"isvc-sklearn-batcher-predictor-77df99677f-dd6m6\" (UID: \"61e5ef93-f2a2-4790-959d-834c553d9929\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" Apr 25 00:04:31.456961 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.456834 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/61e5ef93-f2a2-4790-959d-834c553d9929-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-77df99677f-dd6m6\" (UID: \"61e5ef93-f2a2-4790-959d-834c553d9929\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" Apr 25 00:04:31.456961 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.456866 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/61e5ef93-f2a2-4790-959d-834c553d9929-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-77df99677f-dd6m6\" (UID: \"61e5ef93-f2a2-4790-959d-834c553d9929\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" Apr 25 00:04:31.457073 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.457026 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61e5ef93-f2a2-4790-959d-834c553d9929-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-77df99677f-dd6m6\" (UID: \"61e5ef93-f2a2-4790-959d-834c553d9929\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" Apr 25 00:04:31.457183 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:04:31.457159 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-serving-cert: secret "isvc-sklearn-batcher-predictor-serving-cert" not found Apr 25 00:04:31.457262 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:04:31.457248 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61e5ef93-f2a2-4790-959d-834c553d9929-proxy-tls podName:61e5ef93-f2a2-4790-959d-834c553d9929 nodeName:}" failed. No retries permitted until 2026-04-25 00:04:31.957224298 +0000 UTC m=+632.238341197 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/61e5ef93-f2a2-4790-959d-834c553d9929-proxy-tls") pod "isvc-sklearn-batcher-predictor-77df99677f-dd6m6" (UID: "61e5ef93-f2a2-4790-959d-834c553d9929") : secret "isvc-sklearn-batcher-predictor-serving-cert" not found Apr 25 00:04:31.457328 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.457267 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/61e5ef93-f2a2-4790-959d-834c553d9929-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-77df99677f-dd6m6\" (UID: \"61e5ef93-f2a2-4790-959d-834c553d9929\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" Apr 25 00:04:31.457543 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.457523 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/61e5ef93-f2a2-4790-959d-834c553d9929-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-77df99677f-dd6m6\" (UID: \"61e5ef93-f2a2-4790-959d-834c553d9929\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" Apr 25 00:04:31.465322 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.465304 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvvv5\" (UniqueName: \"kubernetes.io/projected/61e5ef93-f2a2-4790-959d-834c553d9929-kube-api-access-hvvv5\") pod \"isvc-sklearn-batcher-predictor-77df99677f-dd6m6\" (UID: \"61e5ef93-f2a2-4790-959d-834c553d9929\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" Apr 25 00:04:31.961272 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.961242 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/61e5ef93-f2a2-4790-959d-834c553d9929-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-77df99677f-dd6m6\" (UID: \"61e5ef93-f2a2-4790-959d-834c553d9929\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" Apr 25 00:04:31.963742 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:31.963710 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61e5ef93-f2a2-4790-959d-834c553d9929-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-77df99677f-dd6m6\" (UID: \"61e5ef93-f2a2-4790-959d-834c553d9929\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" Apr 25 00:04:32.228472 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:32.228395 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" Apr 25 00:04:32.348756 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:32.348612 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6"] Apr 25 00:04:32.351388 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:04:32.351361 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61e5ef93_f2a2_4790_959d_834c553d9929.slice/crio-0e2322b4f6b94f77fcb6cf06d1d5f357857163c607632eba8cf6bd65d12d71da WatchSource:0}: Error finding container 0e2322b4f6b94f77fcb6cf06d1d5f357857163c607632eba8cf6bd65d12d71da: Status 404 returned error can't find the container with id 0e2322b4f6b94f77fcb6cf06d1d5f357857163c607632eba8cf6bd65d12d71da Apr 25 00:04:33.074302 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:33.074263 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" 
event={"ID":"61e5ef93-f2a2-4790-959d-834c553d9929","Type":"ContainerStarted","Data":"0e2322b4f6b94f77fcb6cf06d1d5f357857163c607632eba8cf6bd65d12d71da"} Apr 25 00:04:36.083674 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:36.083635 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" event={"ID":"61e5ef93-f2a2-4790-959d-834c553d9929","Type":"ContainerStarted","Data":"27c2a1e7aeb07914049e35a5c6e8bd6c0b397f8710779ab40daf34e259d1c669"} Apr 25 00:04:39.092757 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:39.092724 2576 generic.go:358] "Generic (PLEG): container finished" podID="61e5ef93-f2a2-4790-959d-834c553d9929" containerID="27c2a1e7aeb07914049e35a5c6e8bd6c0b397f8710779ab40daf34e259d1c669" exitCode=0 Apr 25 00:04:39.093137 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:39.092809 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" event={"ID":"61e5ef93-f2a2-4790-959d-834c553d9929","Type":"ContainerDied","Data":"27c2a1e7aeb07914049e35a5c6e8bd6c0b397f8710779ab40daf34e259d1c669"} Apr 25 00:04:52.389390 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:52.389370 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:04:53.144578 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:53.144517 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" event={"ID":"61e5ef93-f2a2-4790-959d-834c553d9929","Type":"ContainerStarted","Data":"71553333d0928d01ce6a1256cd3350c681b066138c9b98c71a8d60b8c4878373"} Apr 25 00:04:55.152028 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:55.151991 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" 
event={"ID":"61e5ef93-f2a2-4790-959d-834c553d9929","Type":"ContainerStarted","Data":"76fb22678ad9ec365363c7bbdd25a17df0a7a5f33243b1cf24e3706303f0d934"} Apr 25 00:04:57.160809 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:57.160713 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" event={"ID":"61e5ef93-f2a2-4790-959d-834c553d9929","Type":"ContainerStarted","Data":"59eb124aa3b2d861dc7ecd5a7616dd452491873625066ba25bbc050379f2a6f1"} Apr 25 00:04:57.161202 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:57.160942 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" Apr 25 00:04:57.161202 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:57.161071 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" Apr 25 00:04:57.162246 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:57.162221 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 25 00:04:57.180578 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:57.180527 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podStartSLOduration=1.710411192 podStartE2EDuration="26.180515863s" podCreationTimestamp="2026-04-25 00:04:31 +0000 UTC" firstStartedPulling="2026-04-25 00:04:32.353271796 +0000 UTC m=+632.634388693" lastFinishedPulling="2026-04-25 00:04:56.823376466 +0000 UTC m=+657.104493364" observedRunningTime="2026-04-25 00:04:57.178731844 +0000 UTC m=+657.459848765" watchObservedRunningTime="2026-04-25 00:04:57.180515863 
+0000 UTC m=+657.461632821" Apr 25 00:04:58.163991 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:58.163949 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" Apr 25 00:04:58.164440 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:58.164120 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 25 00:04:58.165149 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:58.165124 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:04:59.167145 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:59.167104 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 25 00:04:59.167643 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:59.167540 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:04:59.170719 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:04:59.170700 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" Apr 25 00:05:00.169958 ip-10-0-129-109 
kubenswrapper[2576]: I0425 00:05:00.169891 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 25 00:05:00.170387 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:05:00.170260 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:05:10.170609 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:05:10.170548 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 25 00:05:10.171091 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:05:10.171063 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:05:20.170747 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:05:20.170700 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 25 00:05:20.171239 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:05:20.171181 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:05:30.170611 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:05:30.170567 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 25 00:05:30.171082 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:05:30.171055 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:05:40.170627 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:05:40.170526 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 25 00:05:40.171055 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:05:40.171036 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:05:50.170151 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:05:50.170107 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.22:8080: connect: connection refused" Apr 25 00:05:50.170684 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:05:50.170660 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:06:00.170584 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:00.170555 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" Apr 25 00:06:00.171051 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:00.170613 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" Apr 25 00:06:06.289162 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:06.289126 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6"] Apr 25 00:06:06.289638 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:06.289588 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kserve-container" containerID="cri-o://71553333d0928d01ce6a1256cd3350c681b066138c9b98c71a8d60b8c4878373" gracePeriod=30 Apr 25 00:06:06.289699 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:06.289628 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="agent" containerID="cri-o://59eb124aa3b2d861dc7ecd5a7616dd452491873625066ba25bbc050379f2a6f1" gracePeriod=30 Apr 25 00:06:06.289746 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:06.289731 2576 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kube-rbac-proxy" containerID="cri-o://76fb22678ad9ec365363c7bbdd25a17df0a7a5f33243b1cf24e3706303f0d934" gracePeriod=30 Apr 25 00:06:06.412525 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:06.412497 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"] Apr 25 00:06:06.415604 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:06.415584 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" Apr 25 00:06:06.417730 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:06.417702 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-predictor-serving-cert\"" Apr 25 00:06:06.417848 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:06.417805 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\"" Apr 25 00:06:06.424236 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:06.424213 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"] Apr 25 00:06:06.462510 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:06.462483 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnnfs\" (UniqueName: \"kubernetes.io/projected/7d950b28-55b1-4d41-911c-d452a84c9863-kube-api-access-fnnfs\") pod \"isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v\" (UID: \"7d950b28-55b1-4d41-911c-d452a84c9863\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" Apr 25 00:06:06.462637 ip-10-0-129-109 
kubenswrapper[2576]: I0425 00:06:06.462514 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d950b28-55b1-4d41-911c-d452a84c9863-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v\" (UID: \"7d950b28-55b1-4d41-911c-d452a84c9863\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"
Apr 25 00:06:06.462637 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:06.462536 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7d950b28-55b1-4d41-911c-d452a84c9863-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v\" (UID: \"7d950b28-55b1-4d41-911c-d452a84c9863\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"
Apr 25 00:06:06.462637 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:06.462567 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d950b28-55b1-4d41-911c-d452a84c9863-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v\" (UID: \"7d950b28-55b1-4d41-911c-d452a84c9863\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"
Apr 25 00:06:06.563306 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:06.563233 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fnnfs\" (UniqueName: \"kubernetes.io/projected/7d950b28-55b1-4d41-911c-d452a84c9863-kube-api-access-fnnfs\") pod \"isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v\" (UID: \"7d950b28-55b1-4d41-911c-d452a84c9863\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"
Apr 25 00:06:06.563306 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:06.563265 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d950b28-55b1-4d41-911c-d452a84c9863-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v\" (UID: \"7d950b28-55b1-4d41-911c-d452a84c9863\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"
Apr 25 00:06:06.563306 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:06.563287 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7d950b28-55b1-4d41-911c-d452a84c9863-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v\" (UID: \"7d950b28-55b1-4d41-911c-d452a84c9863\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"
Apr 25 00:06:06.563545 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:06.563319 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d950b28-55b1-4d41-911c-d452a84c9863-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v\" (UID: \"7d950b28-55b1-4d41-911c-d452a84c9863\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"
Apr 25 00:06:06.563719 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:06.563700 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d950b28-55b1-4d41-911c-d452a84c9863-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v\" (UID: \"7d950b28-55b1-4d41-911c-d452a84c9863\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"
Apr 25 00:06:06.564030 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:06.564007 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7d950b28-55b1-4d41-911c-d452a84c9863-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v\" (UID: \"7d950b28-55b1-4d41-911c-d452a84c9863\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"
Apr 25 00:06:06.565852 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:06.565835 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d950b28-55b1-4d41-911c-d452a84c9863-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v\" (UID: \"7d950b28-55b1-4d41-911c-d452a84c9863\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"
Apr 25 00:06:06.571881 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:06.571861 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnnfs\" (UniqueName: \"kubernetes.io/projected/7d950b28-55b1-4d41-911c-d452a84c9863-kube-api-access-fnnfs\") pod \"isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v\" (UID: \"7d950b28-55b1-4d41-911c-d452a84c9863\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"
Apr 25 00:06:06.726600 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:06.726567 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"
Apr 25 00:06:06.847869 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:06.847840 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"]
Apr 25 00:06:06.849951 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:06:06.849906 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d950b28_55b1_4d41_911c_d452a84c9863.slice/crio-655ae576a4831d3c387648464fc96046379a0103cb4100c24139c6d5991d7f0b WatchSource:0}: Error finding container 655ae576a4831d3c387648464fc96046379a0103cb4100c24139c6d5991d7f0b: Status 404 returned error can't find the container with id 655ae576a4831d3c387648464fc96046379a0103cb4100c24139c6d5991d7f0b
Apr 25 00:06:07.355557 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:07.355525 2576 generic.go:358] "Generic (PLEG): container finished" podID="61e5ef93-f2a2-4790-959d-834c553d9929" containerID="76fb22678ad9ec365363c7bbdd25a17df0a7a5f33243b1cf24e3706303f0d934" exitCode=2
Apr 25 00:06:07.356003 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:07.355605 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" event={"ID":"61e5ef93-f2a2-4790-959d-834c553d9929","Type":"ContainerDied","Data":"76fb22678ad9ec365363c7bbdd25a17df0a7a5f33243b1cf24e3706303f0d934"}
Apr 25 00:06:07.357001 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:07.356978 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" event={"ID":"7d950b28-55b1-4d41-911c-d452a84c9863","Type":"ContainerStarted","Data":"0411273e6b2adf8100f30d5eaa19bb8081c2048b004e9fa665b181909fd98a4f"}
Apr 25 00:06:07.357074 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:07.357009 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" event={"ID":"7d950b28-55b1-4d41-911c-d452a84c9863","Type":"ContainerStarted","Data":"655ae576a4831d3c387648464fc96046379a0103cb4100c24139c6d5991d7f0b"}
Apr 25 00:06:09.167470 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:09.167421 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused"
Apr 25 00:06:10.170014 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:10.169969 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 25 00:06:10.170366 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:10.170281 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 25 00:06:10.367359 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:10.367326 2576 generic.go:358] "Generic (PLEG): container finished" podID="61e5ef93-f2a2-4790-959d-834c553d9929" containerID="71553333d0928d01ce6a1256cd3350c681b066138c9b98c71a8d60b8c4878373" exitCode=0
Apr 25 00:06:10.367526 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:10.367366 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" event={"ID":"61e5ef93-f2a2-4790-959d-834c553d9929","Type":"ContainerDied","Data":"71553333d0928d01ce6a1256cd3350c681b066138c9b98c71a8d60b8c4878373"}
Apr 25 00:06:11.371518 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:11.371482 2576 generic.go:358] "Generic (PLEG): container finished" podID="7d950b28-55b1-4d41-911c-d452a84c9863" containerID="0411273e6b2adf8100f30d5eaa19bb8081c2048b004e9fa665b181909fd98a4f" exitCode=0
Apr 25 00:06:11.372011 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:11.371534 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" event={"ID":"7d950b28-55b1-4d41-911c-d452a84c9863","Type":"ContainerDied","Data":"0411273e6b2adf8100f30d5eaa19bb8081c2048b004e9fa665b181909fd98a4f"}
Apr 25 00:06:12.376979 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:12.376946 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" event={"ID":"7d950b28-55b1-4d41-911c-d452a84c9863","Type":"ContainerStarted","Data":"e1a6c34daa9becaf587c2aa12426fa100c01e85b6f643885a8d563a57587e00e"}
Apr 25 00:06:12.377451 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:12.376990 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" event={"ID":"7d950b28-55b1-4d41-911c-d452a84c9863","Type":"ContainerStarted","Data":"019602ad0b6036fbda88d9475413e217853a0be5d8190ff57c1b88d4b2f728d3"}
Apr 25 00:06:12.377451 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:12.377003 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" event={"ID":"7d950b28-55b1-4d41-911c-d452a84c9863","Type":"ContainerStarted","Data":"f88b666dc1b6466be737ab08e35e4d753c2c2a55f99fbaf3445b909a48542afc"}
Apr 25 00:06:12.377451 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:12.377318 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"
Apr 25 00:06:12.377451 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:12.377350 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"
Apr 25 00:06:12.377451 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:12.377363 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"
Apr 25 00:06:12.378694 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:12.378667 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:5000: connect: connection refused"
Apr 25 00:06:12.379412 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:12.379389 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 25 00:06:12.397435 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:12.397397 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podStartSLOduration=6.397384915 podStartE2EDuration="6.397384915s" podCreationTimestamp="2026-04-25 00:06:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:06:12.395778345 +0000 UTC m=+732.676895266" watchObservedRunningTime="2026-04-25 00:06:12.397384915 +0000 UTC m=+732.678501835"
Apr 25 00:06:13.380726 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:13.380681 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:5000: connect: connection refused"
Apr 25 00:06:13.381212 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:13.381114 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 25 00:06:14.167746 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:14.167705 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused"
Apr 25 00:06:18.384429 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:18.384400 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"
Apr 25 00:06:18.385061 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:18.385032 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:5000: connect: connection refused"
Apr 25 00:06:18.385349 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:18.385321 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 25 00:06:19.167753 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:19.167704 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused"
Apr 25 00:06:19.167963 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:19.167869 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6"
Apr 25 00:06:20.170663 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:20.170621 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 25 00:06:20.171096 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:20.170977 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 25 00:06:24.168169 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:24.168126 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused"
Apr 25 00:06:28.385040 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:28.384997 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:5000: connect: connection refused"
Apr 25 00:06:28.385413 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:28.385355 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 25 00:06:29.168159 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:29.168115 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused"
Apr 25 00:06:30.170452 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:30.170403 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 25 00:06:30.170893 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:30.170565 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6"
Apr 25 00:06:30.170893 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:30.170741 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 25 00:06:30.170893 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:30.170854 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6"
Apr 25 00:06:34.167682 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:34.167642 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused"
Apr 25 00:06:36.430045 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.430017 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6"
Apr 25 00:06:36.448263 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.448238 2576 generic.go:358] "Generic (PLEG): container finished" podID="61e5ef93-f2a2-4790-959d-834c553d9929" containerID="59eb124aa3b2d861dc7ecd5a7616dd452491873625066ba25bbc050379f2a6f1" exitCode=0
Apr 25 00:06:36.448379 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.448318 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6"
Apr 25 00:06:36.448379 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.448321 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" event={"ID":"61e5ef93-f2a2-4790-959d-834c553d9929","Type":"ContainerDied","Data":"59eb124aa3b2d861dc7ecd5a7616dd452491873625066ba25bbc050379f2a6f1"}
Apr 25 00:06:36.448379 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.448368 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6" event={"ID":"61e5ef93-f2a2-4790-959d-834c553d9929","Type":"ContainerDied","Data":"0e2322b4f6b94f77fcb6cf06d1d5f357857163c607632eba8cf6bd65d12d71da"}
Apr 25 00:06:36.448534 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.448389 2576 scope.go:117] "RemoveContainer" containerID="59eb124aa3b2d861dc7ecd5a7616dd452491873625066ba25bbc050379f2a6f1"
Apr 25 00:06:36.456958 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.456941 2576 scope.go:117] "RemoveContainer" containerID="76fb22678ad9ec365363c7bbdd25a17df0a7a5f33243b1cf24e3706303f0d934"
Apr 25 00:06:36.464402 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.464387 2576 scope.go:117] "RemoveContainer" containerID="71553333d0928d01ce6a1256cd3350c681b066138c9b98c71a8d60b8c4878373"
Apr 25 00:06:36.471047 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.471031 2576 scope.go:117] "RemoveContainer" containerID="27c2a1e7aeb07914049e35a5c6e8bd6c0b397f8710779ab40daf34e259d1c669"
Apr 25 00:06:36.477364 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.477348 2576 scope.go:117] "RemoveContainer" containerID="59eb124aa3b2d861dc7ecd5a7616dd452491873625066ba25bbc050379f2a6f1"
Apr 25 00:06:36.477602 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:06:36.477578 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59eb124aa3b2d861dc7ecd5a7616dd452491873625066ba25bbc050379f2a6f1\": container with ID starting with 59eb124aa3b2d861dc7ecd5a7616dd452491873625066ba25bbc050379f2a6f1 not found: ID does not exist" containerID="59eb124aa3b2d861dc7ecd5a7616dd452491873625066ba25bbc050379f2a6f1"
Apr 25 00:06:36.477678 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.477607 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59eb124aa3b2d861dc7ecd5a7616dd452491873625066ba25bbc050379f2a6f1"} err="failed to get container status \"59eb124aa3b2d861dc7ecd5a7616dd452491873625066ba25bbc050379f2a6f1\": rpc error: code = NotFound desc = could not find container \"59eb124aa3b2d861dc7ecd5a7616dd452491873625066ba25bbc050379f2a6f1\": container with ID starting with 59eb124aa3b2d861dc7ecd5a7616dd452491873625066ba25bbc050379f2a6f1 not found: ID does not exist"
Apr 25 00:06:36.477678 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.477624 2576 scope.go:117] "RemoveContainer" containerID="76fb22678ad9ec365363c7bbdd25a17df0a7a5f33243b1cf24e3706303f0d934"
Apr 25 00:06:36.477835 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:06:36.477819 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76fb22678ad9ec365363c7bbdd25a17df0a7a5f33243b1cf24e3706303f0d934\": container with ID starting with 76fb22678ad9ec365363c7bbdd25a17df0a7a5f33243b1cf24e3706303f0d934 not found: ID does not exist" containerID="76fb22678ad9ec365363c7bbdd25a17df0a7a5f33243b1cf24e3706303f0d934"
Apr 25 00:06:36.477873 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.477839 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76fb22678ad9ec365363c7bbdd25a17df0a7a5f33243b1cf24e3706303f0d934"} err="failed to get container status \"76fb22678ad9ec365363c7bbdd25a17df0a7a5f33243b1cf24e3706303f0d934\": rpc error: code = NotFound desc = could not find container \"76fb22678ad9ec365363c7bbdd25a17df0a7a5f33243b1cf24e3706303f0d934\": container with ID starting with 76fb22678ad9ec365363c7bbdd25a17df0a7a5f33243b1cf24e3706303f0d934 not found: ID does not exist"
Apr 25 00:06:36.477873 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.477851 2576 scope.go:117] "RemoveContainer" containerID="71553333d0928d01ce6a1256cd3350c681b066138c9b98c71a8d60b8c4878373"
Apr 25 00:06:36.478117 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:06:36.478100 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71553333d0928d01ce6a1256cd3350c681b066138c9b98c71a8d60b8c4878373\": container with ID starting with 71553333d0928d01ce6a1256cd3350c681b066138c9b98c71a8d60b8c4878373 not found: ID does not exist" containerID="71553333d0928d01ce6a1256cd3350c681b066138c9b98c71a8d60b8c4878373"
Apr 25 00:06:36.478167 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.478122 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71553333d0928d01ce6a1256cd3350c681b066138c9b98c71a8d60b8c4878373"} err="failed to get container status \"71553333d0928d01ce6a1256cd3350c681b066138c9b98c71a8d60b8c4878373\": rpc error: code = NotFound desc = could not find container \"71553333d0928d01ce6a1256cd3350c681b066138c9b98c71a8d60b8c4878373\": container with ID starting with 71553333d0928d01ce6a1256cd3350c681b066138c9b98c71a8d60b8c4878373 not found: ID does not exist"
Apr 25 00:06:36.478167 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.478137 2576 scope.go:117] "RemoveContainer" containerID="27c2a1e7aeb07914049e35a5c6e8bd6c0b397f8710779ab40daf34e259d1c669"
Apr 25 00:06:36.478416 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:06:36.478399 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27c2a1e7aeb07914049e35a5c6e8bd6c0b397f8710779ab40daf34e259d1c669\": container with ID starting with 27c2a1e7aeb07914049e35a5c6e8bd6c0b397f8710779ab40daf34e259d1c669 not found: ID does not exist" containerID="27c2a1e7aeb07914049e35a5c6e8bd6c0b397f8710779ab40daf34e259d1c669"
Apr 25 00:06:36.478459 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.478420 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27c2a1e7aeb07914049e35a5c6e8bd6c0b397f8710779ab40daf34e259d1c669"} err="failed to get container status \"27c2a1e7aeb07914049e35a5c6e8bd6c0b397f8710779ab40daf34e259d1c669\": rpc error: code = NotFound desc = could not find container \"27c2a1e7aeb07914049e35a5c6e8bd6c0b397f8710779ab40daf34e259d1c669\": container with ID starting with 27c2a1e7aeb07914049e35a5c6e8bd6c0b397f8710779ab40daf34e259d1c669 not found: ID does not exist"
Apr 25 00:06:36.492716 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.492698 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61e5ef93-f2a2-4790-959d-834c553d9929-proxy-tls\") pod \"61e5ef93-f2a2-4790-959d-834c553d9929\" (UID: \"61e5ef93-f2a2-4790-959d-834c553d9929\") "
Apr 25 00:06:36.492799 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.492737 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvvv5\" (UniqueName: \"kubernetes.io/projected/61e5ef93-f2a2-4790-959d-834c553d9929-kube-api-access-hvvv5\") pod \"61e5ef93-f2a2-4790-959d-834c553d9929\" (UID: \"61e5ef93-f2a2-4790-959d-834c553d9929\") "
Apr 25 00:06:36.492799 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.492773 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/61e5ef93-f2a2-4790-959d-834c553d9929-kserve-provision-location\") pod \"61e5ef93-f2a2-4790-959d-834c553d9929\" (UID: \"61e5ef93-f2a2-4790-959d-834c553d9929\") "
Apr 25 00:06:36.492799 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.492791 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/61e5ef93-f2a2-4790-959d-834c553d9929-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"61e5ef93-f2a2-4790-959d-834c553d9929\" (UID: \"61e5ef93-f2a2-4790-959d-834c553d9929\") "
Apr 25 00:06:36.493123 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.493102 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61e5ef93-f2a2-4790-959d-834c553d9929-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "61e5ef93-f2a2-4790-959d-834c553d9929" (UID: "61e5ef93-f2a2-4790-959d-834c553d9929"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:06:36.493190 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.493142 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61e5ef93-f2a2-4790-959d-834c553d9929-isvc-sklearn-batcher-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-kube-rbac-proxy-sar-config") pod "61e5ef93-f2a2-4790-959d-834c553d9929" (UID: "61e5ef93-f2a2-4790-959d-834c553d9929"). InnerVolumeSpecName "isvc-sklearn-batcher-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:06:36.494990 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.494965 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61e5ef93-f2a2-4790-959d-834c553d9929-kube-api-access-hvvv5" (OuterVolumeSpecName: "kube-api-access-hvvv5") pod "61e5ef93-f2a2-4790-959d-834c553d9929" (UID: "61e5ef93-f2a2-4790-959d-834c553d9929"). InnerVolumeSpecName "kube-api-access-hvvv5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:06:36.494990 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.494974 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61e5ef93-f2a2-4790-959d-834c553d9929-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "61e5ef93-f2a2-4790-959d-834c553d9929" (UID: "61e5ef93-f2a2-4790-959d-834c553d9929"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:06:36.594255 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.594185 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61e5ef93-f2a2-4790-959d-834c553d9929-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:06:36.594255 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.594219 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hvvv5\" (UniqueName: \"kubernetes.io/projected/61e5ef93-f2a2-4790-959d-834c553d9929-kube-api-access-hvvv5\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:06:36.594255 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.594230 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/61e5ef93-f2a2-4790-959d-834c553d9929-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:06:36.594255 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.594239 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/61e5ef93-f2a2-4790-959d-834c553d9929-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:06:36.770595 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.770570 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6"]
Apr 25 00:06:36.775962 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:36.775940 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-77df99677f-dd6m6"]
Apr 25 00:06:38.315780 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:38.315748 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" path="/var/lib/kubelet/pods/61e5ef93-f2a2-4790-959d-834c553d9929/volumes"
Apr 25 00:06:38.385831 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:38.385783 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:5000: connect: connection refused"
Apr 25 00:06:38.386202 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:38.386177 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 25 00:06:48.385431 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:48.385385 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:5000: connect: connection refused"
Apr 25 00:06:48.385990 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:48.385963 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 25 00:06:58.385150 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:58.385106 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:5000: connect: connection refused"
Apr 25 00:06:58.385539 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:06:58.385369 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 25 00:07:08.385139 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:08.385065 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:5000: connect: connection refused"
Apr 25 00:07:08.385588 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:08.385417 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 25 00:07:18.385615 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:18.385587 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"
Apr 25 00:07:18.386022 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:18.385778 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"
Apr 25 00:07:31.465324 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.465289 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"]
Apr 25 00:07:31.465741 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.465607 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kserve-container" containerID="cri-o://f88b666dc1b6466be737ab08e35e4d753c2c2a55f99fbaf3445b909a48542afc" gracePeriod=30
Apr 25 00:07:31.465741 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.465671 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kube-rbac-proxy" containerID="cri-o://019602ad0b6036fbda88d9475413e217853a0be5d8190ff57c1b88d4b2f728d3" gracePeriod=30
Apr 25 00:07:31.465855 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.465656 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="agent" containerID="cri-o://e1a6c34daa9becaf587c2aa12426fa100c01e85b6f643885a8d563a57587e00e" gracePeriod=30
Apr 25 00:07:31.531076 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.531042 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt"]
Apr 25 00:07:31.531376 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.531363 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="agent"
Apr 25 00:07:31.531421 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.531378 2576 state_mem.go:107] "Deleted CPUSet
assignment" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="agent" Apr 25 00:07:31.531421 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.531389 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kserve-container" Apr 25 00:07:31.531421 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.531395 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kserve-container" Apr 25 00:07:31.531421 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.531402 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kube-rbac-proxy" Apr 25 00:07:31.531421 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.531408 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kube-rbac-proxy" Apr 25 00:07:31.531421 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.531421 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="storage-initializer" Apr 25 00:07:31.531591 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.531427 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="storage-initializer" Apr 25 00:07:31.531591 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.531469 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="agent" Apr 25 00:07:31.531591 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.531478 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kserve-container" Apr 25 00:07:31.531591 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.531486 2576 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="61e5ef93-f2a2-4790-959d-834c553d9929" containerName="kube-rbac-proxy" Apr 25 00:07:31.534430 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.534412 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" Apr 25 00:07:31.536775 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.536751 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-predictor-serving-cert\"" Apr 25 00:07:31.536890 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.536780 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-kube-rbac-proxy-sar-config\"" Apr 25 00:07:31.542553 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.542525 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt"] Apr 25 00:07:31.592793 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.592764 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5f946c51-a649-4893-b321-38d19306806d-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-5h8xt\" (UID: \"5f946c51-a649-4893-b321-38d19306806d\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" Apr 25 00:07:31.592906 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.592809 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2kj7\" (UniqueName: \"kubernetes.io/projected/5f946c51-a649-4893-b321-38d19306806d-kube-api-access-l2kj7\") pod \"message-dumper-predictor-c7d86bcbd-5h8xt\" (UID: \"5f946c51-a649-4893-b321-38d19306806d\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" Apr 25 00:07:31.592906 
ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.592835 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f946c51-a649-4893-b321-38d19306806d-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-5h8xt\" (UID: \"5f946c51-a649-4893-b321-38d19306806d\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" Apr 25 00:07:31.609845 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.609817 2576 generic.go:358] "Generic (PLEG): container finished" podID="7d950b28-55b1-4d41-911c-d452a84c9863" containerID="019602ad0b6036fbda88d9475413e217853a0be5d8190ff57c1b88d4b2f728d3" exitCode=2 Apr 25 00:07:31.609952 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.609891 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" event={"ID":"7d950b28-55b1-4d41-911c-d452a84c9863","Type":"ContainerDied","Data":"019602ad0b6036fbda88d9475413e217853a0be5d8190ff57c1b88d4b2f728d3"} Apr 25 00:07:31.694132 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.694107 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2kj7\" (UniqueName: \"kubernetes.io/projected/5f946c51-a649-4893-b321-38d19306806d-kube-api-access-l2kj7\") pod \"message-dumper-predictor-c7d86bcbd-5h8xt\" (UID: \"5f946c51-a649-4893-b321-38d19306806d\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" Apr 25 00:07:31.694281 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.694148 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f946c51-a649-4893-b321-38d19306806d-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-5h8xt\" (UID: \"5f946c51-a649-4893-b321-38d19306806d\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" Apr 25 00:07:31.694281 ip-10-0-129-109 
kubenswrapper[2576]: I0425 00:07:31.694237 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5f946c51-a649-4893-b321-38d19306806d-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-5h8xt\" (UID: \"5f946c51-a649-4893-b321-38d19306806d\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" Apr 25 00:07:31.694404 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:07:31.694338 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/message-dumper-predictor-serving-cert: secret "message-dumper-predictor-serving-cert" not found Apr 25 00:07:31.694454 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:07:31.694421 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f946c51-a649-4893-b321-38d19306806d-proxy-tls podName:5f946c51-a649-4893-b321-38d19306806d nodeName:}" failed. No retries permitted until 2026-04-25 00:07:32.194398007 +0000 UTC m=+812.475514919 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5f946c51-a649-4893-b321-38d19306806d-proxy-tls") pod "message-dumper-predictor-c7d86bcbd-5h8xt" (UID: "5f946c51-a649-4893-b321-38d19306806d") : secret "message-dumper-predictor-serving-cert" not found Apr 25 00:07:31.694801 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.694782 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5f946c51-a649-4893-b321-38d19306806d-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-5h8xt\" (UID: \"5f946c51-a649-4893-b321-38d19306806d\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" Apr 25 00:07:31.704417 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:31.704399 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2kj7\" (UniqueName: \"kubernetes.io/projected/5f946c51-a649-4893-b321-38d19306806d-kube-api-access-l2kj7\") pod \"message-dumper-predictor-c7d86bcbd-5h8xt\" (UID: \"5f946c51-a649-4893-b321-38d19306806d\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" Apr 25 00:07:32.198191 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:32.198158 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f946c51-a649-4893-b321-38d19306806d-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-5h8xt\" (UID: \"5f946c51-a649-4893-b321-38d19306806d\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" Apr 25 00:07:32.200633 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:32.200615 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f946c51-a649-4893-b321-38d19306806d-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-5h8xt\" (UID: 
\"5f946c51-a649-4893-b321-38d19306806d\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" Apr 25 00:07:32.446040 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:32.446007 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" Apr 25 00:07:32.566406 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:32.566154 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt"] Apr 25 00:07:32.569029 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:07:32.568999 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f946c51_a649_4893_b321_38d19306806d.slice/crio-1f73c22a7f96de305979de669079ea1bba2900fd685948c6042abb2b91921350 WatchSource:0}: Error finding container 1f73c22a7f96de305979de669079ea1bba2900fd685948c6042abb2b91921350: Status 404 returned error can't find the container with id 1f73c22a7f96de305979de669079ea1bba2900fd685948c6042abb2b91921350 Apr 25 00:07:32.613484 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:32.613448 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" event={"ID":"5f946c51-a649-4893-b321-38d19306806d","Type":"ContainerStarted","Data":"1f73c22a7f96de305979de669079ea1bba2900fd685948c6042abb2b91921350"} Apr 25 00:07:33.381052 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:33.380997 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.23:8643/healthz\": dial tcp 10.134.0.23:8643: connect: connection refused" Apr 25 00:07:34.620243 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:34.620203 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" event={"ID":"5f946c51-a649-4893-b321-38d19306806d","Type":"ContainerStarted","Data":"900799dc5e498c809cf42764b9eaa4e3d49c06233102f5d1294b873781d0f21d"} Apr 25 00:07:34.620243 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:34.620249 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" event={"ID":"5f946c51-a649-4893-b321-38d19306806d","Type":"ContainerStarted","Data":"05c41b81c11afa3e1b80184aeef9b87c549192c838cb8e63c57054ab63d0bd37"} Apr 25 00:07:34.620658 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:34.620335 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" Apr 25 00:07:34.620658 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:34.620360 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" Apr 25 00:07:34.621735 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:34.621713 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" Apr 25 00:07:34.638231 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:34.638178 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" podStartSLOduration=2.6185708 podStartE2EDuration="3.638167542s" podCreationTimestamp="2026-04-25 00:07:31 +0000 UTC" firstStartedPulling="2026-04-25 00:07:32.57082914 +0000 UTC m=+812.851946037" lastFinishedPulling="2026-04-25 00:07:33.590425868 +0000 UTC m=+813.871542779" observedRunningTime="2026-04-25 00:07:34.636124882 +0000 UTC m=+814.917241813" watchObservedRunningTime="2026-04-25 00:07:34.638167542 +0000 UTC m=+814.919284461" Apr 25 00:07:35.625202 ip-10-0-129-109 kubenswrapper[2576]: I0425 
00:07:35.625113 2576 generic.go:358] "Generic (PLEG): container finished" podID="7d950b28-55b1-4d41-911c-d452a84c9863" containerID="f88b666dc1b6466be737ab08e35e4d753c2c2a55f99fbaf3445b909a48542afc" exitCode=0 Apr 25 00:07:35.625202 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:35.625183 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" event={"ID":"7d950b28-55b1-4d41-911c-d452a84c9863","Type":"ContainerDied","Data":"f88b666dc1b6466be737ab08e35e4d753c2c2a55f99fbaf3445b909a48542afc"} Apr 25 00:07:38.381108 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:38.381069 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.23:8643/healthz\": dial tcp 10.134.0.23:8643: connect: connection refused" Apr 25 00:07:38.385375 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:38.385340 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:5000: connect: connection refused" Apr 25 00:07:38.385705 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:38.385686 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:07:41.632637 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:41.632608 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" Apr 25 00:07:43.381336 ip-10-0-129-109 
kubenswrapper[2576]: I0425 00:07:43.381293 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.23:8643/healthz\": dial tcp 10.134.0.23:8643: connect: connection refused" Apr 25 00:07:43.381800 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:43.381411 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" Apr 25 00:07:48.381218 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:48.381180 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.23:8643/healthz\": dial tcp 10.134.0.23:8643: connect: connection refused" Apr 25 00:07:48.385536 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:48.385508 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:5000: connect: connection refused" Apr 25 00:07:48.385832 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:48.385809 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:07:51.587245 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:51.587205 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt"] Apr 25 
00:07:51.590288 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:51.590266 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:07:51.592768 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:51.592747 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-predictor-serving-cert\"" Apr 25 00:07:51.592892 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:51.592828 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-kube-rbac-proxy-sar-config\"" Apr 25 00:07:51.598050 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:51.597891 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt"] Apr 25 00:07:51.744427 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:51.744398 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-proxy-tls\") pod \"isvc-logger-predictor-6f94d96d74-n2qpt\" (UID: \"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:07:51.744632 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:51.744435 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-kserve-provision-location\") pod \"isvc-logger-predictor-6f94d96d74-n2qpt\" (UID: \"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:07:51.744632 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:51.744457 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkcbn\" 
(UniqueName: \"kubernetes.io/projected/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-kube-api-access-rkcbn\") pod \"isvc-logger-predictor-6f94d96d74-n2qpt\" (UID: \"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:07:51.744632 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:51.744553 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-6f94d96d74-n2qpt\" (UID: \"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:07:51.845203 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:51.845122 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-kserve-provision-location\") pod \"isvc-logger-predictor-6f94d96d74-n2qpt\" (UID: \"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:07:51.845203 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:51.845164 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rkcbn\" (UniqueName: \"kubernetes.io/projected/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-kube-api-access-rkcbn\") pod \"isvc-logger-predictor-6f94d96d74-n2qpt\" (UID: \"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:07:51.845203 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:51.845191 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-isvc-logger-kube-rbac-proxy-sar-config\") 
pod \"isvc-logger-predictor-6f94d96d74-n2qpt\" (UID: \"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:07:51.845441 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:51.845255 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-proxy-tls\") pod \"isvc-logger-predictor-6f94d96d74-n2qpt\" (UID: \"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:07:51.845588 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:51.845566 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-kserve-provision-location\") pod \"isvc-logger-predictor-6f94d96d74-n2qpt\" (UID: \"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:07:51.845808 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:51.845789 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-6f94d96d74-n2qpt\" (UID: \"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:07:51.847789 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:51.847772 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-proxy-tls\") pod \"isvc-logger-predictor-6f94d96d74-n2qpt\" (UID: \"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:07:51.853266 ip-10-0-129-109 kubenswrapper[2576]: I0425 
00:07:51.853247 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkcbn\" (UniqueName: \"kubernetes.io/projected/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-kube-api-access-rkcbn\") pod \"isvc-logger-predictor-6f94d96d74-n2qpt\" (UID: \"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:07:51.901399 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:51.901373 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:07:52.025481 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:52.025454 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt"] Apr 25 00:07:52.027989 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:07:52.027953 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffdc07f1_d385_4cdd_9f34_0025e3f8d0f3.slice/crio-f0a6394d2957a9f9aeca4136afff588021452ca09edbeb8fd1261099679b4bde WatchSource:0}: Error finding container f0a6394d2957a9f9aeca4136afff588021452ca09edbeb8fd1261099679b4bde: Status 404 returned error can't find the container with id f0a6394d2957a9f9aeca4136afff588021452ca09edbeb8fd1261099679b4bde Apr 25 00:07:52.672719 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:52.672682 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" event={"ID":"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3","Type":"ContainerStarted","Data":"46eb01bdf595d672a43205f217cc1dfaa388524703d45f55b422efbe0ce37651"} Apr 25 00:07:52.672719 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:52.672722 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" 
event={"ID":"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3","Type":"ContainerStarted","Data":"f0a6394d2957a9f9aeca4136afff588021452ca09edbeb8fd1261099679b4bde"} Apr 25 00:07:53.381115 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:53.381071 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.23:8643/healthz\": dial tcp 10.134.0.23:8643: connect: connection refused" Apr 25 00:07:56.685576 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:56.685542 2576 generic.go:358] "Generic (PLEG): container finished" podID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerID="46eb01bdf595d672a43205f217cc1dfaa388524703d45f55b422efbe0ce37651" exitCode=0 Apr 25 00:07:56.686009 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:56.685617 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" event={"ID":"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3","Type":"ContainerDied","Data":"46eb01bdf595d672a43205f217cc1dfaa388524703d45f55b422efbe0ce37651"} Apr 25 00:07:57.691064 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:57.691030 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" event={"ID":"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3","Type":"ContainerStarted","Data":"8d46b8cdd4e09e9ddee914abc46e165f62907155bb03692b45b4c57f2b39a2ec"} Apr 25 00:07:57.691424 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:57.691073 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" event={"ID":"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3","Type":"ContainerStarted","Data":"434016ca3df43aa3b11022478909e095cc05d1ddea451d71dd45d6c0a15dbd64"} Apr 25 00:07:57.691424 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:57.691087 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" event={"ID":"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3","Type":"ContainerStarted","Data":"4e2cb21e2257ace303193b69ffda1222f01db9c799e119ad97e7f1c44688c7f4"} Apr 25 00:07:57.691424 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:57.691389 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:07:57.691559 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:57.691519 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:07:57.692609 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:57.692582 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 25 00:07:57.710008 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:57.709949 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podStartSLOduration=6.709938226 podStartE2EDuration="6.709938226s" podCreationTimestamp="2026-04-25 00:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:07:57.708048054 +0000 UTC m=+837.989164974" watchObservedRunningTime="2026-04-25 00:07:57.709938226 +0000 UTC m=+837.991055136" Apr 25 00:07:58.381594 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:58.381543 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kube-rbac-proxy" 
probeResult="failure" output="Get \"https://10.134.0.23:8643/healthz\": dial tcp 10.134.0.23:8643: connect: connection refused" Apr 25 00:07:58.385022 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:58.384992 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:5000: connect: connection refused" Apr 25 00:07:58.385146 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:58.385132 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" Apr 25 00:07:58.385259 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:58.385235 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:07:58.385341 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:58.385330 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" Apr 25 00:07:58.694163 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:58.694083 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:07:58.694564 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:58.694180 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 25 00:07:58.695097 ip-10-0-129-109 kubenswrapper[2576]: 
I0425 00:07:58.695075 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:07:59.697651 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:59.697614 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 25 00:07:59.698095 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:07:59.697900 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:08:01.610890 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.610854 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" Apr 25 00:08:01.704891 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.704858 2576 generic.go:358] "Generic (PLEG): container finished" podID="7d950b28-55b1-4d41-911c-d452a84c9863" containerID="e1a6c34daa9becaf587c2aa12426fa100c01e85b6f643885a8d563a57587e00e" exitCode=0 Apr 25 00:08:01.705074 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.704951 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" event={"ID":"7d950b28-55b1-4d41-911c-d452a84c9863","Type":"ContainerDied","Data":"e1a6c34daa9becaf587c2aa12426fa100c01e85b6f643885a8d563a57587e00e"} Apr 25 00:08:01.705074 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.704974 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" Apr 25 00:08:01.705074 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.704996 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v" event={"ID":"7d950b28-55b1-4d41-911c-d452a84c9863","Type":"ContainerDied","Data":"655ae576a4831d3c387648464fc96046379a0103cb4100c24139c6d5991d7f0b"} Apr 25 00:08:01.705074 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.705017 2576 scope.go:117] "RemoveContainer" containerID="e1a6c34daa9becaf587c2aa12426fa100c01e85b6f643885a8d563a57587e00e" Apr 25 00:08:01.713941 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.713900 2576 scope.go:117] "RemoveContainer" containerID="019602ad0b6036fbda88d9475413e217853a0be5d8190ff57c1b88d4b2f728d3" Apr 25 00:08:01.721081 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.721066 2576 scope.go:117] "RemoveContainer" containerID="f88b666dc1b6466be737ab08e35e4d753c2c2a55f99fbaf3445b909a48542afc" Apr 25 00:08:01.725121 ip-10-0-129-109 
kubenswrapper[2576]: I0425 00:08:01.725102 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7d950b28-55b1-4d41-911c-d452a84c9863-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"7d950b28-55b1-4d41-911c-d452a84c9863\" (UID: \"7d950b28-55b1-4d41-911c-d452a84c9863\") " Apr 25 00:08:01.725232 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.725194 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnnfs\" (UniqueName: \"kubernetes.io/projected/7d950b28-55b1-4d41-911c-d452a84c9863-kube-api-access-fnnfs\") pod \"7d950b28-55b1-4d41-911c-d452a84c9863\" (UID: \"7d950b28-55b1-4d41-911c-d452a84c9863\") " Apr 25 00:08:01.725232 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.725223 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d950b28-55b1-4d41-911c-d452a84c9863-kserve-provision-location\") pod \"7d950b28-55b1-4d41-911c-d452a84c9863\" (UID: \"7d950b28-55b1-4d41-911c-d452a84c9863\") " Apr 25 00:08:01.725326 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.725243 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d950b28-55b1-4d41-911c-d452a84c9863-proxy-tls\") pod \"7d950b28-55b1-4d41-911c-d452a84c9863\" (UID: \"7d950b28-55b1-4d41-911c-d452a84c9863\") " Apr 25 00:08:01.725539 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.725509 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d950b28-55b1-4d41-911c-d452a84c9863-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config") pod "7d950b28-55b1-4d41-911c-d452a84c9863" (UID: 
"7d950b28-55b1-4d41-911c-d452a84c9863"). InnerVolumeSpecName "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:08:01.725627 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.725527 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d950b28-55b1-4d41-911c-d452a84c9863-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7d950b28-55b1-4d41-911c-d452a84c9863" (UID: "7d950b28-55b1-4d41-911c-d452a84c9863"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:08:01.727311 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.727283 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d950b28-55b1-4d41-911c-d452a84c9863-kube-api-access-fnnfs" (OuterVolumeSpecName: "kube-api-access-fnnfs") pod "7d950b28-55b1-4d41-911c-d452a84c9863" (UID: "7d950b28-55b1-4d41-911c-d452a84c9863"). InnerVolumeSpecName "kube-api-access-fnnfs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:08:01.727414 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.727372 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d950b28-55b1-4d41-911c-d452a84c9863-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7d950b28-55b1-4d41-911c-d452a84c9863" (UID: "7d950b28-55b1-4d41-911c-d452a84c9863"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:08:01.728481 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.728464 2576 scope.go:117] "RemoveContainer" containerID="0411273e6b2adf8100f30d5eaa19bb8081c2048b004e9fa665b181909fd98a4f" Apr 25 00:08:01.739616 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.739587 2576 scope.go:117] "RemoveContainer" containerID="e1a6c34daa9becaf587c2aa12426fa100c01e85b6f643885a8d563a57587e00e" Apr 25 00:08:01.739844 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:08:01.739827 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1a6c34daa9becaf587c2aa12426fa100c01e85b6f643885a8d563a57587e00e\": container with ID starting with e1a6c34daa9becaf587c2aa12426fa100c01e85b6f643885a8d563a57587e00e not found: ID does not exist" containerID="e1a6c34daa9becaf587c2aa12426fa100c01e85b6f643885a8d563a57587e00e" Apr 25 00:08:01.739899 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.739853 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1a6c34daa9becaf587c2aa12426fa100c01e85b6f643885a8d563a57587e00e"} err="failed to get container status \"e1a6c34daa9becaf587c2aa12426fa100c01e85b6f643885a8d563a57587e00e\": rpc error: code = NotFound desc = could not find container \"e1a6c34daa9becaf587c2aa12426fa100c01e85b6f643885a8d563a57587e00e\": container with ID starting with e1a6c34daa9becaf587c2aa12426fa100c01e85b6f643885a8d563a57587e00e not found: ID does not exist" Apr 25 00:08:01.739899 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.739873 2576 scope.go:117] "RemoveContainer" containerID="019602ad0b6036fbda88d9475413e217853a0be5d8190ff57c1b88d4b2f728d3" Apr 25 00:08:01.740126 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:08:01.740105 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"019602ad0b6036fbda88d9475413e217853a0be5d8190ff57c1b88d4b2f728d3\": container with ID starting with 019602ad0b6036fbda88d9475413e217853a0be5d8190ff57c1b88d4b2f728d3 not found: ID does not exist" containerID="019602ad0b6036fbda88d9475413e217853a0be5d8190ff57c1b88d4b2f728d3" Apr 25 00:08:01.740188 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.740137 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"019602ad0b6036fbda88d9475413e217853a0be5d8190ff57c1b88d4b2f728d3"} err="failed to get container status \"019602ad0b6036fbda88d9475413e217853a0be5d8190ff57c1b88d4b2f728d3\": rpc error: code = NotFound desc = could not find container \"019602ad0b6036fbda88d9475413e217853a0be5d8190ff57c1b88d4b2f728d3\": container with ID starting with 019602ad0b6036fbda88d9475413e217853a0be5d8190ff57c1b88d4b2f728d3 not found: ID does not exist" Apr 25 00:08:01.740188 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.740160 2576 scope.go:117] "RemoveContainer" containerID="f88b666dc1b6466be737ab08e35e4d753c2c2a55f99fbaf3445b909a48542afc" Apr 25 00:08:01.740414 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:08:01.740396 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f88b666dc1b6466be737ab08e35e4d753c2c2a55f99fbaf3445b909a48542afc\": container with ID starting with f88b666dc1b6466be737ab08e35e4d753c2c2a55f99fbaf3445b909a48542afc not found: ID does not exist" containerID="f88b666dc1b6466be737ab08e35e4d753c2c2a55f99fbaf3445b909a48542afc" Apr 25 00:08:01.740453 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.740421 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f88b666dc1b6466be737ab08e35e4d753c2c2a55f99fbaf3445b909a48542afc"} err="failed to get container status \"f88b666dc1b6466be737ab08e35e4d753c2c2a55f99fbaf3445b909a48542afc\": rpc error: code = NotFound desc = could not find container 
\"f88b666dc1b6466be737ab08e35e4d753c2c2a55f99fbaf3445b909a48542afc\": container with ID starting with f88b666dc1b6466be737ab08e35e4d753c2c2a55f99fbaf3445b909a48542afc not found: ID does not exist" Apr 25 00:08:01.740453 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.740437 2576 scope.go:117] "RemoveContainer" containerID="0411273e6b2adf8100f30d5eaa19bb8081c2048b004e9fa665b181909fd98a4f" Apr 25 00:08:01.740653 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:08:01.740638 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0411273e6b2adf8100f30d5eaa19bb8081c2048b004e9fa665b181909fd98a4f\": container with ID starting with 0411273e6b2adf8100f30d5eaa19bb8081c2048b004e9fa665b181909fd98a4f not found: ID does not exist" containerID="0411273e6b2adf8100f30d5eaa19bb8081c2048b004e9fa665b181909fd98a4f" Apr 25 00:08:01.740692 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.740657 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0411273e6b2adf8100f30d5eaa19bb8081c2048b004e9fa665b181909fd98a4f"} err="failed to get container status \"0411273e6b2adf8100f30d5eaa19bb8081c2048b004e9fa665b181909fd98a4f\": rpc error: code = NotFound desc = could not find container \"0411273e6b2adf8100f30d5eaa19bb8081c2048b004e9fa665b181909fd98a4f\": container with ID starting with 0411273e6b2adf8100f30d5eaa19bb8081c2048b004e9fa665b181909fd98a4f not found: ID does not exist" Apr 25 00:08:01.825940 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.825883 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fnnfs\" (UniqueName: \"kubernetes.io/projected/7d950b28-55b1-4d41-911c-d452a84c9863-kube-api-access-fnnfs\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:08:01.825940 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.825932 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/7d950b28-55b1-4d41-911c-d452a84c9863-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:08:01.825940 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.825945 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d950b28-55b1-4d41-911c-d452a84c9863-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:08:01.826152 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:01.825956 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7d950b28-55b1-4d41-911c-d452a84c9863-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:08:02.028198 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:02.028172 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"] Apr 25 00:08:02.034514 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:02.034491 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-fbfbfccd6-jdq9v"] Apr 25 00:08:02.315518 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:02.315481 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" path="/var/lib/kubelet/pods/7d950b28-55b1-4d41-911c-d452a84c9863/volumes" Apr 25 00:08:04.701504 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:04.701479 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:08:04.702071 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:04.702038 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" 
podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 25 00:08:04.702376 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:04.702349 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:08:14.702605 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:14.702560 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 25 00:08:14.703141 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:14.703039 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:08:24.702137 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:24.702096 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 25 00:08:24.702605 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:24.702498 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:08:34.702421 ip-10-0-129-109 kubenswrapper[2576]: I0425 
00:08:34.702372 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 25 00:08:34.702801 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:34.702731 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:08:44.702324 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:44.702279 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 25 00:08:44.702783 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:44.702714 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:08:54.702772 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:54.702730 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 25 00:08:54.703277 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:08:54.703202 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="agent" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:09:00.229630 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:00.229598 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 25 00:09:00.231877 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:00.231856 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 25 00:09:04.702747 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:04.702719 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:09:04.703204 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:04.703181 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:09:16.605724 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.605689 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-c7d86bcbd-5h8xt_5f946c51-a649-4893-b321-38d19306806d/kserve-container/0.log" Apr 25 00:09:16.768296 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.768267 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt"] Apr 25 00:09:16.768708 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.768650 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kserve-container" containerID="cri-o://4e2cb21e2257ace303193b69ffda1222f01db9c799e119ad97e7f1c44688c7f4" gracePeriod=30 Apr 25 00:09:16.768708 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.768677 2576 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kube-rbac-proxy" containerID="cri-o://434016ca3df43aa3b11022478909e095cc05d1ddea451d71dd45d6c0a15dbd64" gracePeriod=30 Apr 25 00:09:16.768708 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.768677 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="agent" containerID="cri-o://8d46b8cdd4e09e9ddee914abc46e165f62907155bb03692b45b4c57f2b39a2ec" gracePeriod=30 Apr 25 00:09:16.813744 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.813715 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk"] Apr 25 00:09:16.814061 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.814047 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kserve-container" Apr 25 00:09:16.814113 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.814063 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kserve-container" Apr 25 00:09:16.814113 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.814074 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kube-rbac-proxy" Apr 25 00:09:16.814113 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.814080 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kube-rbac-proxy" Apr 25 00:09:16.814113 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.814091 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" 
containerName="storage-initializer" Apr 25 00:09:16.814113 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.814096 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="storage-initializer" Apr 25 00:09:16.814113 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.814103 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="agent" Apr 25 00:09:16.814113 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.814109 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="agent" Apr 25 00:09:16.814319 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.814172 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kserve-container" Apr 25 00:09:16.814319 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.814180 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="kube-rbac-proxy" Apr 25 00:09:16.814319 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.814188 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d950b28-55b1-4d41-911c-d452a84c9863" containerName="agent" Apr 25 00:09:16.817364 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.817348 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" Apr 25 00:09:16.819478 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.819453 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-kube-rbac-proxy-sar-config\"" Apr 25 00:09:16.819575 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.819549 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-predictor-serving-cert\"" Apr 25 00:09:16.827566 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.827544 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk"] Apr 25 00:09:16.877906 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.877841 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt"] Apr 25 00:09:16.878249 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.878191 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" podUID="5f946c51-a649-4893-b321-38d19306806d" containerName="kserve-container" containerID="cri-o://05c41b81c11afa3e1b80184aeef9b87c549192c838cb8e63c57054ab63d0bd37" gracePeriod=30 Apr 25 00:09:16.878473 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.878215 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" podUID="5f946c51-a649-4893-b321-38d19306806d" containerName="kube-rbac-proxy" containerID="cri-o://900799dc5e498c809cf42764b9eaa4e3d49c06233102f5d1294b873781d0f21d" gracePeriod=30 Apr 25 00:09:16.925095 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.925069 2576 generic.go:358] "Generic (PLEG): container finished" podID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" 
containerID="434016ca3df43aa3b11022478909e095cc05d1ddea451d71dd45d6c0a15dbd64" exitCode=2 Apr 25 00:09:16.925183 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.925136 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" event={"ID":"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3","Type":"ContainerDied","Data":"434016ca3df43aa3b11022478909e095cc05d1ddea451d71dd45d6c0a15dbd64"} Apr 25 00:09:16.962449 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.962421 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fafeb93c-6003-480c-a81f-995c509a2189-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-vvmhk\" (UID: \"fafeb93c-6003-480c-a81f-995c509a2189\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" Apr 25 00:09:16.962548 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.962457 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c54jq\" (UniqueName: \"kubernetes.io/projected/fafeb93c-6003-480c-a81f-995c509a2189-kube-api-access-c54jq\") pod \"isvc-lightgbm-predictor-bdf964bd-vvmhk\" (UID: \"fafeb93c-6003-480c-a81f-995c509a2189\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" Apr 25 00:09:16.962548 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.962493 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fafeb93c-6003-480c-a81f-995c509a2189-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-vvmhk\" (UID: \"fafeb93c-6003-480c-a81f-995c509a2189\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" Apr 25 00:09:16.962646 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:16.962552 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fafeb93c-6003-480c-a81f-995c509a2189-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-vvmhk\" (UID: \"fafeb93c-6003-480c-a81f-995c509a2189\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" Apr 25 00:09:17.063485 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.063454 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fafeb93c-6003-480c-a81f-995c509a2189-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-vvmhk\" (UID: \"fafeb93c-6003-480c-a81f-995c509a2189\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" Apr 25 00:09:17.063639 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.063510 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c54jq\" (UniqueName: \"kubernetes.io/projected/fafeb93c-6003-480c-a81f-995c509a2189-kube-api-access-c54jq\") pod \"isvc-lightgbm-predictor-bdf964bd-vvmhk\" (UID: \"fafeb93c-6003-480c-a81f-995c509a2189\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" Apr 25 00:09:17.063639 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.063570 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fafeb93c-6003-480c-a81f-995c509a2189-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-vvmhk\" (UID: \"fafeb93c-6003-480c-a81f-995c509a2189\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" Apr 25 00:09:17.063639 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.063608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/fafeb93c-6003-480c-a81f-995c509a2189-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-vvmhk\" (UID: \"fafeb93c-6003-480c-a81f-995c509a2189\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" Apr 25 00:09:17.064196 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.064170 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fafeb93c-6003-480c-a81f-995c509a2189-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-vvmhk\" (UID: \"fafeb93c-6003-480c-a81f-995c509a2189\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" Apr 25 00:09:17.064373 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.064345 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fafeb93c-6003-480c-a81f-995c509a2189-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-vvmhk\" (UID: \"fafeb93c-6003-480c-a81f-995c509a2189\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" Apr 25 00:09:17.066577 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.066553 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fafeb93c-6003-480c-a81f-995c509a2189-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-vvmhk\" (UID: \"fafeb93c-6003-480c-a81f-995c509a2189\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" Apr 25 00:09:17.081661 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.081635 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c54jq\" (UniqueName: \"kubernetes.io/projected/fafeb93c-6003-480c-a81f-995c509a2189-kube-api-access-c54jq\") pod \"isvc-lightgbm-predictor-bdf964bd-vvmhk\" (UID: \"fafeb93c-6003-480c-a81f-995c509a2189\") " 
pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" Apr 25 00:09:17.111655 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.111636 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" Apr 25 00:09:17.131456 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.131407 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" Apr 25 00:09:17.253651 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.253535 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk"] Apr 25 00:09:17.256333 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:09:17.256305 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfafeb93c_6003_480c_a81f_995c509a2189.slice/crio-b0fa97fb9ed4ac196a0b81e9f91257bcaa3d887c7efe38b7a9baf9d105c9c989 WatchSource:0}: Error finding container b0fa97fb9ed4ac196a0b81e9f91257bcaa3d887c7efe38b7a9baf9d105c9c989: Status 404 returned error can't find the container with id b0fa97fb9ed4ac196a0b81e9f91257bcaa3d887c7efe38b7a9baf9d105c9c989 Apr 25 00:09:17.265382 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.265361 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5f946c51-a649-4893-b321-38d19306806d-message-dumper-kube-rbac-proxy-sar-config\") pod \"5f946c51-a649-4893-b321-38d19306806d\" (UID: \"5f946c51-a649-4893-b321-38d19306806d\") " Apr 25 00:09:17.265477 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.265408 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f946c51-a649-4893-b321-38d19306806d-proxy-tls\") pod 
\"5f946c51-a649-4893-b321-38d19306806d\" (UID: \"5f946c51-a649-4893-b321-38d19306806d\") " Apr 25 00:09:17.265477 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.265445 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2kj7\" (UniqueName: \"kubernetes.io/projected/5f946c51-a649-4893-b321-38d19306806d-kube-api-access-l2kj7\") pod \"5f946c51-a649-4893-b321-38d19306806d\" (UID: \"5f946c51-a649-4893-b321-38d19306806d\") " Apr 25 00:09:17.265695 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.265675 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f946c51-a649-4893-b321-38d19306806d-message-dumper-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-kube-rbac-proxy-sar-config") pod "5f946c51-a649-4893-b321-38d19306806d" (UID: "5f946c51-a649-4893-b321-38d19306806d"). InnerVolumeSpecName "message-dumper-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:09:17.267506 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.267484 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f946c51-a649-4893-b321-38d19306806d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5f946c51-a649-4893-b321-38d19306806d" (UID: "5f946c51-a649-4893-b321-38d19306806d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:09:17.267709 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.267686 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f946c51-a649-4893-b321-38d19306806d-kube-api-access-l2kj7" (OuterVolumeSpecName: "kube-api-access-l2kj7") pod "5f946c51-a649-4893-b321-38d19306806d" (UID: "5f946c51-a649-4893-b321-38d19306806d"). InnerVolumeSpecName "kube-api-access-l2kj7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:09:17.365978 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.365943 2576 reconciler_common.go:299] "Volume detached for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5f946c51-a649-4893-b321-38d19306806d-message-dumper-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:09:17.365978 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.365971 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5f946c51-a649-4893-b321-38d19306806d-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:09:17.365978 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.365984 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l2kj7\" (UniqueName: \"kubernetes.io/projected/5f946c51-a649-4893-b321-38d19306806d-kube-api-access-l2kj7\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:09:17.929093 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.929062 2576 generic.go:358] "Generic (PLEG): container finished" podID="5f946c51-a649-4893-b321-38d19306806d" containerID="900799dc5e498c809cf42764b9eaa4e3d49c06233102f5d1294b873781d0f21d" exitCode=2 Apr 25 00:09:17.929093 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.929089 2576 generic.go:358] "Generic (PLEG): container finished" podID="5f946c51-a649-4893-b321-38d19306806d" containerID="05c41b81c11afa3e1b80184aeef9b87c549192c838cb8e63c57054ab63d0bd37" exitCode=2 Apr 25 00:09:17.929533 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.929129 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" Apr 25 00:09:17.929533 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.929152 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" event={"ID":"5f946c51-a649-4893-b321-38d19306806d","Type":"ContainerDied","Data":"900799dc5e498c809cf42764b9eaa4e3d49c06233102f5d1294b873781d0f21d"} Apr 25 00:09:17.929533 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.929199 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" event={"ID":"5f946c51-a649-4893-b321-38d19306806d","Type":"ContainerDied","Data":"05c41b81c11afa3e1b80184aeef9b87c549192c838cb8e63c57054ab63d0bd37"} Apr 25 00:09:17.929533 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.929216 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt" event={"ID":"5f946c51-a649-4893-b321-38d19306806d","Type":"ContainerDied","Data":"1f73c22a7f96de305979de669079ea1bba2900fd685948c6042abb2b91921350"} Apr 25 00:09:17.929533 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.929235 2576 scope.go:117] "RemoveContainer" containerID="900799dc5e498c809cf42764b9eaa4e3d49c06233102f5d1294b873781d0f21d" Apr 25 00:09:17.930636 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.930613 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" event={"ID":"fafeb93c-6003-480c-a81f-995c509a2189","Type":"ContainerStarted","Data":"17ebe592c64224b5cefafae6a818d60ef9c17cecfd6a3c906c0d4b2a8b498934"} Apr 25 00:09:17.930754 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.930643 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" 
event={"ID":"fafeb93c-6003-480c-a81f-995c509a2189","Type":"ContainerStarted","Data":"b0fa97fb9ed4ac196a0b81e9f91257bcaa3d887c7efe38b7a9baf9d105c9c989"} Apr 25 00:09:17.937099 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.937078 2576 scope.go:117] "RemoveContainer" containerID="05c41b81c11afa3e1b80184aeef9b87c549192c838cb8e63c57054ab63d0bd37" Apr 25 00:09:17.946013 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.945515 2576 scope.go:117] "RemoveContainer" containerID="900799dc5e498c809cf42764b9eaa4e3d49c06233102f5d1294b873781d0f21d" Apr 25 00:09:17.946071 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:09:17.946008 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"900799dc5e498c809cf42764b9eaa4e3d49c06233102f5d1294b873781d0f21d\": container with ID starting with 900799dc5e498c809cf42764b9eaa4e3d49c06233102f5d1294b873781d0f21d not found: ID does not exist" containerID="900799dc5e498c809cf42764b9eaa4e3d49c06233102f5d1294b873781d0f21d" Apr 25 00:09:17.946071 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.946039 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"900799dc5e498c809cf42764b9eaa4e3d49c06233102f5d1294b873781d0f21d"} err="failed to get container status \"900799dc5e498c809cf42764b9eaa4e3d49c06233102f5d1294b873781d0f21d\": rpc error: code = NotFound desc = could not find container \"900799dc5e498c809cf42764b9eaa4e3d49c06233102f5d1294b873781d0f21d\": container with ID starting with 900799dc5e498c809cf42764b9eaa4e3d49c06233102f5d1294b873781d0f21d not found: ID does not exist" Apr 25 00:09:17.946071 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.946062 2576 scope.go:117] "RemoveContainer" containerID="05c41b81c11afa3e1b80184aeef9b87c549192c838cb8e63c57054ab63d0bd37" Apr 25 00:09:17.946484 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:09:17.946465 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"05c41b81c11afa3e1b80184aeef9b87c549192c838cb8e63c57054ab63d0bd37\": container with ID starting with 05c41b81c11afa3e1b80184aeef9b87c549192c838cb8e63c57054ab63d0bd37 not found: ID does not exist" containerID="05c41b81c11afa3e1b80184aeef9b87c549192c838cb8e63c57054ab63d0bd37" Apr 25 00:09:17.946560 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.946492 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05c41b81c11afa3e1b80184aeef9b87c549192c838cb8e63c57054ab63d0bd37"} err="failed to get container status \"05c41b81c11afa3e1b80184aeef9b87c549192c838cb8e63c57054ab63d0bd37\": rpc error: code = NotFound desc = could not find container \"05c41b81c11afa3e1b80184aeef9b87c549192c838cb8e63c57054ab63d0bd37\": container with ID starting with 05c41b81c11afa3e1b80184aeef9b87c549192c838cb8e63c57054ab63d0bd37 not found: ID does not exist" Apr 25 00:09:17.946560 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.946516 2576 scope.go:117] "RemoveContainer" containerID="900799dc5e498c809cf42764b9eaa4e3d49c06233102f5d1294b873781d0f21d" Apr 25 00:09:17.946769 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.946748 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"900799dc5e498c809cf42764b9eaa4e3d49c06233102f5d1294b873781d0f21d"} err="failed to get container status \"900799dc5e498c809cf42764b9eaa4e3d49c06233102f5d1294b873781d0f21d\": rpc error: code = NotFound desc = could not find container \"900799dc5e498c809cf42764b9eaa4e3d49c06233102f5d1294b873781d0f21d\": container with ID starting with 900799dc5e498c809cf42764b9eaa4e3d49c06233102f5d1294b873781d0f21d not found: ID does not exist" Apr 25 00:09:17.946825 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.946770 2576 scope.go:117] "RemoveContainer" containerID="05c41b81c11afa3e1b80184aeef9b87c549192c838cb8e63c57054ab63d0bd37" Apr 25 00:09:17.947022 ip-10-0-129-109 
kubenswrapper[2576]: I0425 00:09:17.947002 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05c41b81c11afa3e1b80184aeef9b87c549192c838cb8e63c57054ab63d0bd37"} err="failed to get container status \"05c41b81c11afa3e1b80184aeef9b87c549192c838cb8e63c57054ab63d0bd37\": rpc error: code = NotFound desc = could not find container \"05c41b81c11afa3e1b80184aeef9b87c549192c838cb8e63c57054ab63d0bd37\": container with ID starting with 05c41b81c11afa3e1b80184aeef9b87c549192c838cb8e63c57054ab63d0bd37 not found: ID does not exist" Apr 25 00:09:17.963280 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.963249 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt"] Apr 25 00:09:17.965733 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:17.965709 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-5h8xt"] Apr 25 00:09:18.315703 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:18.315673 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f946c51-a649-4893-b321-38d19306806d" path="/var/lib/kubelet/pods/5f946c51-a649-4893-b321-38d19306806d/volumes" Apr 25 00:09:19.698260 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:19.698207 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.25:8643/healthz\": dial tcp 10.134.0.25:8643: connect: connection refused" Apr 25 00:09:20.941627 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:20.941546 2576 generic.go:358] "Generic (PLEG): container finished" podID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerID="4e2cb21e2257ace303193b69ffda1222f01db9c799e119ad97e7f1c44688c7f4" exitCode=0 Apr 25 00:09:20.941981 ip-10-0-129-109 kubenswrapper[2576]: I0425 
00:09:20.941621 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" event={"ID":"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3","Type":"ContainerDied","Data":"4e2cb21e2257ace303193b69ffda1222f01db9c799e119ad97e7f1c44688c7f4"} Apr 25 00:09:21.946080 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:21.946047 2576 generic.go:358] "Generic (PLEG): container finished" podID="fafeb93c-6003-480c-a81f-995c509a2189" containerID="17ebe592c64224b5cefafae6a818d60ef9c17cecfd6a3c906c0d4b2a8b498934" exitCode=0 Apr 25 00:09:21.946563 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:21.946128 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" event={"ID":"fafeb93c-6003-480c-a81f-995c509a2189","Type":"ContainerDied","Data":"17ebe592c64224b5cefafae6a818d60ef9c17cecfd6a3c906c0d4b2a8b498934"} Apr 25 00:09:24.697735 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:24.697685 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.25:8643/healthz\": dial tcp 10.134.0.25:8643: connect: connection refused" Apr 25 00:09:24.702530 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:24.702487 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:09:24.703529 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:24.703488 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: 
connect: connection refused" Apr 25 00:09:28.970580 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:28.970547 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" event={"ID":"fafeb93c-6003-480c-a81f-995c509a2189","Type":"ContainerStarted","Data":"211585fa53a092c0ce42ed81a095f7e4645889000d9bd46a9efbc8a32278420b"} Apr 25 00:09:28.970996 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:28.970586 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" event={"ID":"fafeb93c-6003-480c-a81f-995c509a2189","Type":"ContainerStarted","Data":"24f0c3523536f0b5abacb8ad4002c076ee2ce87f64700e3b4d49a64b8291d746"} Apr 25 00:09:28.970996 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:28.970794 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" Apr 25 00:09:28.992045 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:28.991984 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" podStartSLOduration=6.715997071 podStartE2EDuration="12.991967702s" podCreationTimestamp="2026-04-25 00:09:16 +0000 UTC" firstStartedPulling="2026-04-25 00:09:21.947505295 +0000 UTC m=+922.228622192" lastFinishedPulling="2026-04-25 00:09:28.223475908 +0000 UTC m=+928.504592823" observedRunningTime="2026-04-25 00:09:28.990338862 +0000 UTC m=+929.271455785" watchObservedRunningTime="2026-04-25 00:09:28.991967702 +0000 UTC m=+929.273084628" Apr 25 00:09:29.698164 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:29.698121 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.25:8643/healthz\": dial tcp 10.134.0.25:8643: 
connect: connection refused" Apr 25 00:09:29.698334 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:29.698271 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:09:29.973883 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:29.973801 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" Apr 25 00:09:29.974896 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:29.974871 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" podUID="fafeb93c-6003-480c-a81f-995c509a2189" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 25 00:09:30.976399 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:30.976356 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" podUID="fafeb93c-6003-480c-a81f-995c509a2189" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 25 00:09:34.698139 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:34.698091 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.25:8643/healthz\": dial tcp 10.134.0.25:8643: connect: connection refused" Apr 25 00:09:34.702824 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:34.702790 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:09:34.703861 
ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:34.703833 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 25 00:09:35.980390 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:35.980362 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" Apr 25 00:09:35.981039 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:35.981003 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" podUID="fafeb93c-6003-480c-a81f-995c509a2189" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 25 00:09:39.697956 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:39.697894 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.25:8643/healthz\": dial tcp 10.134.0.25:8643: connect: connection refused" Apr 25 00:09:44.698564 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:44.698522 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.25:8643/healthz\": dial tcp 10.134.0.25:8643: connect: connection refused" Apr 25 00:09:44.703163 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:44.703127 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 25 00:09:44.703314 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:44.703257 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:09:44.703314 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:44.703250 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 25 00:09:44.703458 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:44.703443 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:09:45.980891 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:45.980855 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" podUID="fafeb93c-6003-480c-a81f-995c509a2189" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 25 00:09:46.916519 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:46.916498 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:09:47.027689 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.027619 2576 generic.go:358] "Generic (PLEG): container finished" podID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerID="8d46b8cdd4e09e9ddee914abc46e165f62907155bb03692b45b4c57f2b39a2ec" exitCode=0 Apr 25 00:09:47.028097 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.027681 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" event={"ID":"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3","Type":"ContainerDied","Data":"8d46b8cdd4e09e9ddee914abc46e165f62907155bb03692b45b4c57f2b39a2ec"} Apr 25 00:09:47.028097 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.027716 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" Apr 25 00:09:47.028097 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.027729 2576 scope.go:117] "RemoveContainer" containerID="8d46b8cdd4e09e9ddee914abc46e165f62907155bb03692b45b4c57f2b39a2ec" Apr 25 00:09:47.028097 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.027719 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt" event={"ID":"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3","Type":"ContainerDied","Data":"f0a6394d2957a9f9aeca4136afff588021452ca09edbeb8fd1261099679b4bde"} Apr 25 00:09:47.035103 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.035087 2576 scope.go:117] "RemoveContainer" containerID="434016ca3df43aa3b11022478909e095cc05d1ddea451d71dd45d6c0a15dbd64" Apr 25 00:09:47.042227 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.042212 2576 scope.go:117] "RemoveContainer" containerID="4e2cb21e2257ace303193b69ffda1222f01db9c799e119ad97e7f1c44688c7f4" Apr 25 00:09:47.048848 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.048833 2576 scope.go:117] 
"RemoveContainer" containerID="46eb01bdf595d672a43205f217cc1dfaa388524703d45f55b422efbe0ce37651" Apr 25 00:09:47.055397 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.055382 2576 scope.go:117] "RemoveContainer" containerID="8d46b8cdd4e09e9ddee914abc46e165f62907155bb03692b45b4c57f2b39a2ec" Apr 25 00:09:47.055666 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:09:47.055648 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d46b8cdd4e09e9ddee914abc46e165f62907155bb03692b45b4c57f2b39a2ec\": container with ID starting with 8d46b8cdd4e09e9ddee914abc46e165f62907155bb03692b45b4c57f2b39a2ec not found: ID does not exist" containerID="8d46b8cdd4e09e9ddee914abc46e165f62907155bb03692b45b4c57f2b39a2ec" Apr 25 00:09:47.055722 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.055676 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d46b8cdd4e09e9ddee914abc46e165f62907155bb03692b45b4c57f2b39a2ec"} err="failed to get container status \"8d46b8cdd4e09e9ddee914abc46e165f62907155bb03692b45b4c57f2b39a2ec\": rpc error: code = NotFound desc = could not find container \"8d46b8cdd4e09e9ddee914abc46e165f62907155bb03692b45b4c57f2b39a2ec\": container with ID starting with 8d46b8cdd4e09e9ddee914abc46e165f62907155bb03692b45b4c57f2b39a2ec not found: ID does not exist" Apr 25 00:09:47.055722 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.055695 2576 scope.go:117] "RemoveContainer" containerID="434016ca3df43aa3b11022478909e095cc05d1ddea451d71dd45d6c0a15dbd64" Apr 25 00:09:47.055960 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:09:47.055940 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"434016ca3df43aa3b11022478909e095cc05d1ddea451d71dd45d6c0a15dbd64\": container with ID starting with 434016ca3df43aa3b11022478909e095cc05d1ddea451d71dd45d6c0a15dbd64 not found: ID does not exist" 
containerID="434016ca3df43aa3b11022478909e095cc05d1ddea451d71dd45d6c0a15dbd64" Apr 25 00:09:47.056018 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.055966 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"434016ca3df43aa3b11022478909e095cc05d1ddea451d71dd45d6c0a15dbd64"} err="failed to get container status \"434016ca3df43aa3b11022478909e095cc05d1ddea451d71dd45d6c0a15dbd64\": rpc error: code = NotFound desc = could not find container \"434016ca3df43aa3b11022478909e095cc05d1ddea451d71dd45d6c0a15dbd64\": container with ID starting with 434016ca3df43aa3b11022478909e095cc05d1ddea451d71dd45d6c0a15dbd64 not found: ID does not exist" Apr 25 00:09:47.056018 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.055982 2576 scope.go:117] "RemoveContainer" containerID="4e2cb21e2257ace303193b69ffda1222f01db9c799e119ad97e7f1c44688c7f4" Apr 25 00:09:47.056232 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:09:47.056216 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e2cb21e2257ace303193b69ffda1222f01db9c799e119ad97e7f1c44688c7f4\": container with ID starting with 4e2cb21e2257ace303193b69ffda1222f01db9c799e119ad97e7f1c44688c7f4 not found: ID does not exist" containerID="4e2cb21e2257ace303193b69ffda1222f01db9c799e119ad97e7f1c44688c7f4" Apr 25 00:09:47.056277 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.056234 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e2cb21e2257ace303193b69ffda1222f01db9c799e119ad97e7f1c44688c7f4"} err="failed to get container status \"4e2cb21e2257ace303193b69ffda1222f01db9c799e119ad97e7f1c44688c7f4\": rpc error: code = NotFound desc = could not find container \"4e2cb21e2257ace303193b69ffda1222f01db9c799e119ad97e7f1c44688c7f4\": container with ID starting with 4e2cb21e2257ace303193b69ffda1222f01db9c799e119ad97e7f1c44688c7f4 not found: ID does not exist" Apr 25 
00:09:47.056277 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.056246 2576 scope.go:117] "RemoveContainer" containerID="46eb01bdf595d672a43205f217cc1dfaa388524703d45f55b422efbe0ce37651" Apr 25 00:09:47.056465 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:09:47.056448 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46eb01bdf595d672a43205f217cc1dfaa388524703d45f55b422efbe0ce37651\": container with ID starting with 46eb01bdf595d672a43205f217cc1dfaa388524703d45f55b422efbe0ce37651 not found: ID does not exist" containerID="46eb01bdf595d672a43205f217cc1dfaa388524703d45f55b422efbe0ce37651" Apr 25 00:09:47.056507 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.056470 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46eb01bdf595d672a43205f217cc1dfaa388524703d45f55b422efbe0ce37651"} err="failed to get container status \"46eb01bdf595d672a43205f217cc1dfaa388524703d45f55b422efbe0ce37651\": rpc error: code = NotFound desc = could not find container \"46eb01bdf595d672a43205f217cc1dfaa388524703d45f55b422efbe0ce37651\": container with ID starting with 46eb01bdf595d672a43205f217cc1dfaa388524703d45f55b422efbe0ce37651 not found: ID does not exist" Apr 25 00:09:47.095057 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.095036 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-proxy-tls\") pod \"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3\" (UID: \"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3\") " Apr 25 00:09:47.095163 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.095091 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-isvc-logger-kube-rbac-proxy-sar-config\") pod 
\"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3\" (UID: \"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3\") " Apr 25 00:09:47.095163 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.095121 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-kserve-provision-location\") pod \"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3\" (UID: \"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3\") " Apr 25 00:09:47.095271 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.095169 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkcbn\" (UniqueName: \"kubernetes.io/projected/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-kube-api-access-rkcbn\") pod \"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3\" (UID: \"ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3\") " Apr 25 00:09:47.095470 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.095444 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" (UID: "ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:09:47.095470 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.095456 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-isvc-logger-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-kube-rbac-proxy-sar-config") pod "ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" (UID: "ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3"). InnerVolumeSpecName "isvc-logger-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:09:47.097151 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.097131 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" (UID: "ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:09:47.097313 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.097296 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-kube-api-access-rkcbn" (OuterVolumeSpecName: "kube-api-access-rkcbn") pod "ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" (UID: "ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3"). InnerVolumeSpecName "kube-api-access-rkcbn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:09:47.196076 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.196041 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-isvc-logger-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:09:47.196076 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.196072 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:09:47.196076 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.196082 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rkcbn\" (UniqueName: \"kubernetes.io/projected/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-kube-api-access-rkcbn\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" 
Apr 25 00:09:47.196325 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.196091 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:09:47.348798 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.348767 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt"] Apr 25 00:09:47.354283 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:47.354261 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-6f94d96d74-n2qpt"] Apr 25 00:09:48.315269 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:48.315236 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" path="/var/lib/kubelet/pods/ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3/volumes" Apr 25 00:09:55.981733 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:09:55.981684 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" podUID="fafeb93c-6003-480c-a81f-995c509a2189" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 25 00:10:05.981772 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:05.981734 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" podUID="fafeb93c-6003-480c-a81f-995c509a2189" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 25 00:10:15.981906 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:15.981866 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" podUID="fafeb93c-6003-480c-a81f-995c509a2189" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 25 00:10:25.981078 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:25.981032 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" podUID="fafeb93c-6003-480c-a81f-995c509a2189" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 25 00:10:35.980848 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:35.980808 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" podUID="fafeb93c-6003-480c-a81f-995c509a2189" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 25 00:10:45.982134 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:45.982098 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" Apr 25 00:10:46.911661 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:46.911628 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk"] Apr 25 00:10:46.911959 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:46.911932 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" podUID="fafeb93c-6003-480c-a81f-995c509a2189" containerName="kserve-container" containerID="cri-o://24f0c3523536f0b5abacb8ad4002c076ee2ce87f64700e3b4d49a64b8291d746" gracePeriod=30 Apr 25 00:10:46.912045 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:46.911980 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" podUID="fafeb93c-6003-480c-a81f-995c509a2189" containerName="kube-rbac-proxy" 
containerID="cri-o://211585fa53a092c0ce42ed81a095f7e4645889000d9bd46a9efbc8a32278420b" gracePeriod=30 Apr 25 00:10:47.059766 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.059736 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx"] Apr 25 00:10:47.060171 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.060015 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="storage-initializer" Apr 25 00:10:47.060171 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.060027 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="storage-initializer" Apr 25 00:10:47.060171 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.060041 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kserve-container" Apr 25 00:10:47.060171 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.060046 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kserve-container" Apr 25 00:10:47.060171 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.060056 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f946c51-a649-4893-b321-38d19306806d" containerName="kube-rbac-proxy" Apr 25 00:10:47.060171 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.060062 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f946c51-a649-4893-b321-38d19306806d" containerName="kube-rbac-proxy" Apr 25 00:10:47.060171 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.060068 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kube-rbac-proxy" Apr 25 00:10:47.060171 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.060073 2576 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kube-rbac-proxy" Apr 25 00:10:47.060171 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.060080 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="agent" Apr 25 00:10:47.060171 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.060085 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="agent" Apr 25 00:10:47.060171 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.060095 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f946c51-a649-4893-b321-38d19306806d" containerName="kserve-container" Apr 25 00:10:47.060171 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.060099 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f946c51-a649-4893-b321-38d19306806d" containerName="kserve-container" Apr 25 00:10:47.060171 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.060141 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="agent" Apr 25 00:10:47.060171 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.060149 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f946c51-a649-4893-b321-38d19306806d" containerName="kube-rbac-proxy" Apr 25 00:10:47.060171 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.060155 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kube-rbac-proxy" Apr 25 00:10:47.060171 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.060161 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ffdc07f1-d385-4cdd-9f34-0025e3f8d0f3" containerName="kserve-container" Apr 25 00:10:47.060171 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.060166 2576 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="5f946c51-a649-4893-b321-38d19306806d" containerName="kserve-container" Apr 25 00:10:47.062993 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.062977 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" Apr 25 00:10:47.065418 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.065398 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-predictor-serving-cert\"" Apr 25 00:10:47.065511 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.065400 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\"" Apr 25 00:10:47.072960 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.072936 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx"] Apr 25 00:10:47.116365 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.116333 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmxzd\" (UniqueName: \"kubernetes.io/projected/65aeadf0-cf9e-4654-a346-be0d86e992bf-kube-api-access-lmxzd\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx\" (UID: \"65aeadf0-cf9e-4654-a346-be0d86e992bf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" Apr 25 00:10:47.116529 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.116375 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65aeadf0-cf9e-4654-a346-be0d86e992bf-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx\" (UID: \"65aeadf0-cf9e-4654-a346-be0d86e992bf\") " 
pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" Apr 25 00:10:47.116529 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.116412 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65aeadf0-cf9e-4654-a346-be0d86e992bf-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx\" (UID: \"65aeadf0-cf9e-4654-a346-be0d86e992bf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" Apr 25 00:10:47.116529 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.116458 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/65aeadf0-cf9e-4654-a346-be0d86e992bf-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx\" (UID: \"65aeadf0-cf9e-4654-a346-be0d86e992bf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" Apr 25 00:10:47.204817 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.204738 2576 generic.go:358] "Generic (PLEG): container finished" podID="fafeb93c-6003-480c-a81f-995c509a2189" containerID="211585fa53a092c0ce42ed81a095f7e4645889000d9bd46a9efbc8a32278420b" exitCode=2 Apr 25 00:10:47.204968 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.204811 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" event={"ID":"fafeb93c-6003-480c-a81f-995c509a2189","Type":"ContainerDied","Data":"211585fa53a092c0ce42ed81a095f7e4645889000d9bd46a9efbc8a32278420b"} Apr 25 00:10:47.217071 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.217053 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lmxzd\" (UniqueName: \"kubernetes.io/projected/65aeadf0-cf9e-4654-a346-be0d86e992bf-kube-api-access-lmxzd\") pod 
\"isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx\" (UID: \"65aeadf0-cf9e-4654-a346-be0d86e992bf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" Apr 25 00:10:47.217145 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.217080 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65aeadf0-cf9e-4654-a346-be0d86e992bf-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx\" (UID: \"65aeadf0-cf9e-4654-a346-be0d86e992bf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" Apr 25 00:10:47.217145 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.217107 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65aeadf0-cf9e-4654-a346-be0d86e992bf-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx\" (UID: \"65aeadf0-cf9e-4654-a346-be0d86e992bf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" Apr 25 00:10:47.217214 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.217146 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/65aeadf0-cf9e-4654-a346-be0d86e992bf-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx\" (UID: \"65aeadf0-cf9e-4654-a346-be0d86e992bf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" Apr 25 00:10:47.217437 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.217418 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65aeadf0-cf9e-4654-a346-be0d86e992bf-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx\" (UID: 
\"65aeadf0-cf9e-4654-a346-be0d86e992bf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" Apr 25 00:10:47.217797 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.217773 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/65aeadf0-cf9e-4654-a346-be0d86e992bf-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx\" (UID: \"65aeadf0-cf9e-4654-a346-be0d86e992bf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" Apr 25 00:10:47.219717 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.219696 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65aeadf0-cf9e-4654-a346-be0d86e992bf-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx\" (UID: \"65aeadf0-cf9e-4654-a346-be0d86e992bf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" Apr 25 00:10:47.225132 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.225110 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmxzd\" (UniqueName: \"kubernetes.io/projected/65aeadf0-cf9e-4654-a346-be0d86e992bf-kube-api-access-lmxzd\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx\" (UID: \"65aeadf0-cf9e-4654-a346-be0d86e992bf\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" Apr 25 00:10:47.373393 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.373346 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" Apr 25 00:10:47.492750 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.492726 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx"] Apr 25 00:10:47.495159 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:10:47.495129 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65aeadf0_cf9e_4654_a346_be0d86e992bf.slice/crio-ecff9717d290736d7eb25cdc577af4a4f0a7e22be65fd20c445f3c7a66627332 WatchSource:0}: Error finding container ecff9717d290736d7eb25cdc577af4a4f0a7e22be65fd20c445f3c7a66627332: Status 404 returned error can't find the container with id ecff9717d290736d7eb25cdc577af4a4f0a7e22be65fd20c445f3c7a66627332 Apr 25 00:10:47.496817 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:47.496801 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:10:48.208523 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:48.208490 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" event={"ID":"65aeadf0-cf9e-4654-a346-be0d86e992bf","Type":"ContainerStarted","Data":"782c3ab403b76ebd38be5699774a3507aa9b49ca8ad57f242ac7c005123abc3b"} Apr 25 00:10:48.208523 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:48.208525 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" event={"ID":"65aeadf0-cf9e-4654-a346-be0d86e992bf","Type":"ContainerStarted","Data":"ecff9717d290736d7eb25cdc577af4a4f0a7e22be65fd20c445f3c7a66627332"} Apr 25 00:10:50.976684 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:50.976643 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" 
podUID="fafeb93c-6003-480c-a81f-995c509a2189" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.26:8643/healthz\": dial tcp 10.134.0.26:8643: connect: connection refused" Apr 25 00:10:51.642888 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:51.642866 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" Apr 25 00:10:51.747961 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:51.747864 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c54jq\" (UniqueName: \"kubernetes.io/projected/fafeb93c-6003-480c-a81f-995c509a2189-kube-api-access-c54jq\") pod \"fafeb93c-6003-480c-a81f-995c509a2189\" (UID: \"fafeb93c-6003-480c-a81f-995c509a2189\") " Apr 25 00:10:51.747961 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:51.747910 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fafeb93c-6003-480c-a81f-995c509a2189-kserve-provision-location\") pod \"fafeb93c-6003-480c-a81f-995c509a2189\" (UID: \"fafeb93c-6003-480c-a81f-995c509a2189\") " Apr 25 00:10:51.748164 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:51.747973 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fafeb93c-6003-480c-a81f-995c509a2189-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"fafeb93c-6003-480c-a81f-995c509a2189\" (UID: \"fafeb93c-6003-480c-a81f-995c509a2189\") " Apr 25 00:10:51.748164 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:51.748034 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fafeb93c-6003-480c-a81f-995c509a2189-proxy-tls\") pod \"fafeb93c-6003-480c-a81f-995c509a2189\" (UID: \"fafeb93c-6003-480c-a81f-995c509a2189\") 
" Apr 25 00:10:51.748282 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:51.748261 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fafeb93c-6003-480c-a81f-995c509a2189-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fafeb93c-6003-480c-a81f-995c509a2189" (UID: "fafeb93c-6003-480c-a81f-995c509a2189"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:10:51.748333 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:51.748292 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fafeb93c-6003-480c-a81f-995c509a2189-isvc-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-kube-rbac-proxy-sar-config") pod "fafeb93c-6003-480c-a81f-995c509a2189" (UID: "fafeb93c-6003-480c-a81f-995c509a2189"). InnerVolumeSpecName "isvc-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:10:51.750092 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:51.750066 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fafeb93c-6003-480c-a81f-995c509a2189-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fafeb93c-6003-480c-a81f-995c509a2189" (UID: "fafeb93c-6003-480c-a81f-995c509a2189"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:10:51.750158 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:51.750093 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fafeb93c-6003-480c-a81f-995c509a2189-kube-api-access-c54jq" (OuterVolumeSpecName: "kube-api-access-c54jq") pod "fafeb93c-6003-480c-a81f-995c509a2189" (UID: "fafeb93c-6003-480c-a81f-995c509a2189"). InnerVolumeSpecName "kube-api-access-c54jq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:10:51.849407 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:51.849368 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c54jq\" (UniqueName: \"kubernetes.io/projected/fafeb93c-6003-480c-a81f-995c509a2189-kube-api-access-c54jq\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:10:51.849407 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:51.849403 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fafeb93c-6003-480c-a81f-995c509a2189-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:10:51.849554 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:51.849419 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fafeb93c-6003-480c-a81f-995c509a2189-isvc-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:10:51.849554 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:51.849435 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fafeb93c-6003-480c-a81f-995c509a2189-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:10:52.223238 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:52.223203 2576 generic.go:358] "Generic (PLEG): container finished" podID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerID="782c3ab403b76ebd38be5699774a3507aa9b49ca8ad57f242ac7c005123abc3b" exitCode=0 Apr 25 00:10:52.223632 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:52.223277 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" 
event={"ID":"65aeadf0-cf9e-4654-a346-be0d86e992bf","Type":"ContainerDied","Data":"782c3ab403b76ebd38be5699774a3507aa9b49ca8ad57f242ac7c005123abc3b"} Apr 25 00:10:52.225117 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:52.225098 2576 generic.go:358] "Generic (PLEG): container finished" podID="fafeb93c-6003-480c-a81f-995c509a2189" containerID="24f0c3523536f0b5abacb8ad4002c076ee2ce87f64700e3b4d49a64b8291d746" exitCode=0 Apr 25 00:10:52.225223 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:52.225156 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" event={"ID":"fafeb93c-6003-480c-a81f-995c509a2189","Type":"ContainerDied","Data":"24f0c3523536f0b5abacb8ad4002c076ee2ce87f64700e3b4d49a64b8291d746"} Apr 25 00:10:52.225223 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:52.225175 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" event={"ID":"fafeb93c-6003-480c-a81f-995c509a2189","Type":"ContainerDied","Data":"b0fa97fb9ed4ac196a0b81e9f91257bcaa3d887c7efe38b7a9baf9d105c9c989"} Apr 25 00:10:52.225223 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:52.225191 2576 scope.go:117] "RemoveContainer" containerID="211585fa53a092c0ce42ed81a095f7e4645889000d9bd46a9efbc8a32278420b" Apr 25 00:10:52.225223 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:52.225158 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk" Apr 25 00:10:52.233040 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:52.232980 2576 scope.go:117] "RemoveContainer" containerID="24f0c3523536f0b5abacb8ad4002c076ee2ce87f64700e3b4d49a64b8291d746" Apr 25 00:10:52.239812 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:52.239790 2576 scope.go:117] "RemoveContainer" containerID="17ebe592c64224b5cefafae6a818d60ef9c17cecfd6a3c906c0d4b2a8b498934" Apr 25 00:10:52.250795 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:52.250777 2576 scope.go:117] "RemoveContainer" containerID="211585fa53a092c0ce42ed81a095f7e4645889000d9bd46a9efbc8a32278420b" Apr 25 00:10:52.251099 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:10:52.251072 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"211585fa53a092c0ce42ed81a095f7e4645889000d9bd46a9efbc8a32278420b\": container with ID starting with 211585fa53a092c0ce42ed81a095f7e4645889000d9bd46a9efbc8a32278420b not found: ID does not exist" containerID="211585fa53a092c0ce42ed81a095f7e4645889000d9bd46a9efbc8a32278420b" Apr 25 00:10:52.251212 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:52.251109 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"211585fa53a092c0ce42ed81a095f7e4645889000d9bd46a9efbc8a32278420b"} err="failed to get container status \"211585fa53a092c0ce42ed81a095f7e4645889000d9bd46a9efbc8a32278420b\": rpc error: code = NotFound desc = could not find container \"211585fa53a092c0ce42ed81a095f7e4645889000d9bd46a9efbc8a32278420b\": container with ID starting with 211585fa53a092c0ce42ed81a095f7e4645889000d9bd46a9efbc8a32278420b not found: ID does not exist" Apr 25 00:10:52.251212 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:52.251132 2576 scope.go:117] "RemoveContainer" containerID="24f0c3523536f0b5abacb8ad4002c076ee2ce87f64700e3b4d49a64b8291d746" Apr 25 00:10:52.251398 
ip-10-0-129-109 kubenswrapper[2576]: E0425 00:10:52.251375 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24f0c3523536f0b5abacb8ad4002c076ee2ce87f64700e3b4d49a64b8291d746\": container with ID starting with 24f0c3523536f0b5abacb8ad4002c076ee2ce87f64700e3b4d49a64b8291d746 not found: ID does not exist" containerID="24f0c3523536f0b5abacb8ad4002c076ee2ce87f64700e3b4d49a64b8291d746" Apr 25 00:10:52.251440 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:52.251404 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24f0c3523536f0b5abacb8ad4002c076ee2ce87f64700e3b4d49a64b8291d746"} err="failed to get container status \"24f0c3523536f0b5abacb8ad4002c076ee2ce87f64700e3b4d49a64b8291d746\": rpc error: code = NotFound desc = could not find container \"24f0c3523536f0b5abacb8ad4002c076ee2ce87f64700e3b4d49a64b8291d746\": container with ID starting with 24f0c3523536f0b5abacb8ad4002c076ee2ce87f64700e3b4d49a64b8291d746 not found: ID does not exist" Apr 25 00:10:52.251440 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:52.251421 2576 scope.go:117] "RemoveContainer" containerID="17ebe592c64224b5cefafae6a818d60ef9c17cecfd6a3c906c0d4b2a8b498934" Apr 25 00:10:52.251642 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:10:52.251622 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17ebe592c64224b5cefafae6a818d60ef9c17cecfd6a3c906c0d4b2a8b498934\": container with ID starting with 17ebe592c64224b5cefafae6a818d60ef9c17cecfd6a3c906c0d4b2a8b498934 not found: ID does not exist" containerID="17ebe592c64224b5cefafae6a818d60ef9c17cecfd6a3c906c0d4b2a8b498934" Apr 25 00:10:52.251699 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:52.251650 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"17ebe592c64224b5cefafae6a818d60ef9c17cecfd6a3c906c0d4b2a8b498934"} err="failed to get container status \"17ebe592c64224b5cefafae6a818d60ef9c17cecfd6a3c906c0d4b2a8b498934\": rpc error: code = NotFound desc = could not find container \"17ebe592c64224b5cefafae6a818d60ef9c17cecfd6a3c906c0d4b2a8b498934\": container with ID starting with 17ebe592c64224b5cefafae6a818d60ef9c17cecfd6a3c906c0d4b2a8b498934 not found: ID does not exist"
Apr 25 00:10:52.254985 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:52.254964 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk"]
Apr 25 00:10:52.260574 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:52.260556 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-vvmhk"]
Apr 25 00:10:52.315734 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:52.315709 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fafeb93c-6003-480c-a81f-995c509a2189" path="/var/lib/kubelet/pods/fafeb93c-6003-480c-a81f-995c509a2189/volumes"
Apr 25 00:10:53.233467 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:53.233426 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" event={"ID":"65aeadf0-cf9e-4654-a346-be0d86e992bf","Type":"ContainerStarted","Data":"c4d3a99c13528532c91686cbae8ff7ce7d6d2408e6d8195d7de6ba26d17d2510"}
Apr 25 00:10:53.233467 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:53.233470 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" event={"ID":"65aeadf0-cf9e-4654-a346-be0d86e992bf","Type":"ContainerStarted","Data":"1bd3ca611e3b347902baddcaa80d0efbd45fa5327a84b14d8e0f9224f94c0293"}
Apr 25 00:10:53.233999 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:53.233775 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx"
Apr 25 00:10:53.233999 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:53.233896 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx"
Apr 25 00:10:53.235098 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:53.235073 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" podUID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 25 00:10:53.251966 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:53.251897 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" podStartSLOduration=6.251884866 podStartE2EDuration="6.251884866s" podCreationTimestamp="2026-04-25 00:10:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:10:53.25036454 +0000 UTC m=+1013.531481461" watchObservedRunningTime="2026-04-25 00:10:53.251884866 +0000 UTC m=+1013.533001827"
Apr 25 00:10:54.237752 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:54.237715 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" podUID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 25 00:10:59.241858 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:59.241832 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx"
Apr 25 00:10:59.242428 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:10:59.242403 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" podUID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 25 00:11:09.243145 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:11:09.243100 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" podUID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 25 00:11:19.242372 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:11:19.242327 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" podUID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 25 00:11:29.242360 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:11:29.242313 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" podUID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 25 00:11:39.243115 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:11:39.243076 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" podUID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 25 00:11:49.242492 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:11:49.242453 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" podUID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 25 00:11:59.242555 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:11:59.242512 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" podUID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 25 00:12:09.243079 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:09.243047 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx"
Apr 25 00:12:17.628356 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.628326 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx"]
Apr 25 00:12:17.628718 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.628644 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" podUID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerName="kserve-container" containerID="cri-o://1bd3ca611e3b347902baddcaa80d0efbd45fa5327a84b14d8e0f9224f94c0293" gracePeriod=30
Apr 25 00:12:17.628718 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.628672 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" podUID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerName="kube-rbac-proxy" containerID="cri-o://c4d3a99c13528532c91686cbae8ff7ce7d6d2408e6d8195d7de6ba26d17d2510" gracePeriod=30
Apr 25 00:12:17.746967 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.746942 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"]
Apr 25 00:12:17.747230 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.747216 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fafeb93c-6003-480c-a81f-995c509a2189" containerName="kube-rbac-proxy"
Apr 25 00:12:17.747272 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.747236 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fafeb93c-6003-480c-a81f-995c509a2189" containerName="kube-rbac-proxy"
Apr 25 00:12:17.747272 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.747248 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fafeb93c-6003-480c-a81f-995c509a2189" containerName="storage-initializer"
Apr 25 00:12:17.747272 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.747253 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fafeb93c-6003-480c-a81f-995c509a2189" containerName="storage-initializer"
Apr 25 00:12:17.747272 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.747265 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fafeb93c-6003-480c-a81f-995c509a2189" containerName="kserve-container"
Apr 25 00:12:17.747272 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.747271 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fafeb93c-6003-480c-a81f-995c509a2189" containerName="kserve-container"
Apr 25 00:12:17.747434 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.747316 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fafeb93c-6003-480c-a81f-995c509a2189" containerName="kserve-container"
Apr 25 00:12:17.747434 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.747327 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fafeb93c-6003-480c-a81f-995c509a2189" containerName="kube-rbac-proxy"
Apr 25 00:12:17.750335 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.750316 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"
Apr 25 00:12:17.752600 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.752570 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-predictor-serving-cert\""
Apr 25 00:12:17.752773 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.752757 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\""
Apr 25 00:12:17.758996 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.758975 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"]
Apr 25 00:12:17.869801 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.869764 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftcxh\" (UniqueName: \"kubernetes.io/projected/a8cf2bda-3f86-434a-8397-0aae6d811109-kube-api-access-ftcxh\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn\" (UID: \"a8cf2bda-3f86-434a-8397-0aae6d811109\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"
Apr 25 00:12:17.869801 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.869805 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8cf2bda-3f86-434a-8397-0aae6d811109-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn\" (UID: \"a8cf2bda-3f86-434a-8397-0aae6d811109\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"
Apr 25 00:12:17.870014 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.869828 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8cf2bda-3f86-434a-8397-0aae6d811109-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn\" (UID: \"a8cf2bda-3f86-434a-8397-0aae6d811109\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"
Apr 25 00:12:17.870014 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.869850 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8cf2bda-3f86-434a-8397-0aae6d811109-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn\" (UID: \"a8cf2bda-3f86-434a-8397-0aae6d811109\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"
Apr 25 00:12:17.970722 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.970646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftcxh\" (UniqueName: \"kubernetes.io/projected/a8cf2bda-3f86-434a-8397-0aae6d811109-kube-api-access-ftcxh\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn\" (UID: \"a8cf2bda-3f86-434a-8397-0aae6d811109\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"
Apr 25 00:12:17.970722 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.970688 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8cf2bda-3f86-434a-8397-0aae6d811109-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn\" (UID: \"a8cf2bda-3f86-434a-8397-0aae6d811109\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"
Apr 25 00:12:17.970935 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.970721 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8cf2bda-3f86-434a-8397-0aae6d811109-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn\" (UID: \"a8cf2bda-3f86-434a-8397-0aae6d811109\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"
Apr 25 00:12:17.970935 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.970751 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8cf2bda-3f86-434a-8397-0aae6d811109-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn\" (UID: \"a8cf2bda-3f86-434a-8397-0aae6d811109\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"
Apr 25 00:12:17.971186 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.971165 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8cf2bda-3f86-434a-8397-0aae6d811109-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn\" (UID: \"a8cf2bda-3f86-434a-8397-0aae6d811109\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"
Apr 25 00:12:17.971471 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.971450 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8cf2bda-3f86-434a-8397-0aae6d811109-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn\" (UID: \"a8cf2bda-3f86-434a-8397-0aae6d811109\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"
Apr 25 00:12:17.973243 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.973226 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8cf2bda-3f86-434a-8397-0aae6d811109-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn\" (UID: \"a8cf2bda-3f86-434a-8397-0aae6d811109\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"
Apr 25 00:12:17.978937 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:17.978256 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftcxh\" (UniqueName: \"kubernetes.io/projected/a8cf2bda-3f86-434a-8397-0aae6d811109-kube-api-access-ftcxh\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn\" (UID: \"a8cf2bda-3f86-434a-8397-0aae6d811109\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"
Apr 25 00:12:18.061220 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:18.061184 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"
Apr 25 00:12:18.180979 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:18.180899 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"]
Apr 25 00:12:18.183567 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:12:18.183541 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8cf2bda_3f86_434a_8397_0aae6d811109.slice/crio-47967ad9dfa32582d8bdc92fe86ff6330df023e546465baf07380e6f8685b54c WatchSource:0}: Error finding container 47967ad9dfa32582d8bdc92fe86ff6330df023e546465baf07380e6f8685b54c: Status 404 returned error can't find the container with id 47967ad9dfa32582d8bdc92fe86ff6330df023e546465baf07380e6f8685b54c
Apr 25 00:12:18.462885 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:18.462852 2576 generic.go:358] "Generic (PLEG): container finished" podID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerID="c4d3a99c13528532c91686cbae8ff7ce7d6d2408e6d8195d7de6ba26d17d2510" exitCode=2
Apr 25 00:12:18.463069 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:18.462951 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" event={"ID":"65aeadf0-cf9e-4654-a346-be0d86e992bf","Type":"ContainerDied","Data":"c4d3a99c13528532c91686cbae8ff7ce7d6d2408e6d8195d7de6ba26d17d2510"}
Apr 25 00:12:18.464228 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:18.464203 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn" event={"ID":"a8cf2bda-3f86-434a-8397-0aae6d811109","Type":"ContainerStarted","Data":"b03ce628877850c5d1e60448dae5db31236e445a0422184b4dc28de8733af549"}
Apr 25 00:12:18.464357 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:18.464232 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn" event={"ID":"a8cf2bda-3f86-434a-8397-0aae6d811109","Type":"ContainerStarted","Data":"47967ad9dfa32582d8bdc92fe86ff6330df023e546465baf07380e6f8685b54c"}
Apr 25 00:12:19.238430 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:19.238391 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" podUID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.27:8643/healthz\": dial tcp 10.134.0.27:8643: connect: connection refused"
Apr 25 00:12:19.242730 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:19.242700 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" podUID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 25 00:12:22.478897 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:22.478863 2576 generic.go:358] "Generic (PLEG): container finished" podID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerID="1bd3ca611e3b347902baddcaa80d0efbd45fa5327a84b14d8e0f9224f94c0293" exitCode=0
Apr 25 00:12:22.479395 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:22.478951 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" event={"ID":"65aeadf0-cf9e-4654-a346-be0d86e992bf","Type":"ContainerDied","Data":"1bd3ca611e3b347902baddcaa80d0efbd45fa5327a84b14d8e0f9224f94c0293"}
Apr 25 00:12:22.480174 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:22.480143 2576 generic.go:358] "Generic (PLEG): container finished" podID="a8cf2bda-3f86-434a-8397-0aae6d811109" containerID="b03ce628877850c5d1e60448dae5db31236e445a0422184b4dc28de8733af549" exitCode=0
Apr 25 00:12:22.480291 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:22.480175 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn" event={"ID":"a8cf2bda-3f86-434a-8397-0aae6d811109","Type":"ContainerDied","Data":"b03ce628877850c5d1e60448dae5db31236e445a0422184b4dc28de8733af549"}
Apr 25 00:12:22.582034 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:22.582012 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx"
Apr 25 00:12:22.704864 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:22.704792 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65aeadf0-cf9e-4654-a346-be0d86e992bf-proxy-tls\") pod \"65aeadf0-cf9e-4654-a346-be0d86e992bf\" (UID: \"65aeadf0-cf9e-4654-a346-be0d86e992bf\") "
Apr 25 00:12:22.705062 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:22.704890 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmxzd\" (UniqueName: \"kubernetes.io/projected/65aeadf0-cf9e-4654-a346-be0d86e992bf-kube-api-access-lmxzd\") pod \"65aeadf0-cf9e-4654-a346-be0d86e992bf\" (UID: \"65aeadf0-cf9e-4654-a346-be0d86e992bf\") "
Apr 25 00:12:22.705062 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:22.704929 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/65aeadf0-cf9e-4654-a346-be0d86e992bf-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"65aeadf0-cf9e-4654-a346-be0d86e992bf\" (UID: \"65aeadf0-cf9e-4654-a346-be0d86e992bf\") "
Apr 25 00:12:22.705062 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:22.704960 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65aeadf0-cf9e-4654-a346-be0d86e992bf-kserve-provision-location\") pod \"65aeadf0-cf9e-4654-a346-be0d86e992bf\" (UID: \"65aeadf0-cf9e-4654-a346-be0d86e992bf\") "
Apr 25 00:12:22.705350 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:22.705315 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65aeadf0-cf9e-4654-a346-be0d86e992bf-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config") pod "65aeadf0-cf9e-4654-a346-be0d86e992bf" (UID: "65aeadf0-cf9e-4654-a346-be0d86e992bf"). InnerVolumeSpecName "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:12:22.705651 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:22.705312 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65aeadf0-cf9e-4654-a346-be0d86e992bf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "65aeadf0-cf9e-4654-a346-be0d86e992bf" (UID: "65aeadf0-cf9e-4654-a346-be0d86e992bf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:12:22.706903 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:22.706882 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65aeadf0-cf9e-4654-a346-be0d86e992bf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "65aeadf0-cf9e-4654-a346-be0d86e992bf" (UID: "65aeadf0-cf9e-4654-a346-be0d86e992bf"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:12:22.707248 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:22.706972 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65aeadf0-cf9e-4654-a346-be0d86e992bf-kube-api-access-lmxzd" (OuterVolumeSpecName: "kube-api-access-lmxzd") pod "65aeadf0-cf9e-4654-a346-be0d86e992bf" (UID: "65aeadf0-cf9e-4654-a346-be0d86e992bf"). InnerVolumeSpecName "kube-api-access-lmxzd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:12:22.806354 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:22.806331 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lmxzd\" (UniqueName: \"kubernetes.io/projected/65aeadf0-cf9e-4654-a346-be0d86e992bf-kube-api-access-lmxzd\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:12:22.806456 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:22.806356 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/65aeadf0-cf9e-4654-a346-be0d86e992bf-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:12:22.806456 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:22.806372 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65aeadf0-cf9e-4654-a346-be0d86e992bf-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:12:22.806456 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:22.806385 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65aeadf0-cf9e-4654-a346-be0d86e992bf-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:12:23.486281 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:23.486234 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx" event={"ID":"65aeadf0-cf9e-4654-a346-be0d86e992bf","Type":"ContainerDied","Data":"ecff9717d290736d7eb25cdc577af4a4f0a7e22be65fd20c445f3c7a66627332"}
Apr 25 00:12:23.486741 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:23.486299 2576 scope.go:117] "RemoveContainer" containerID="c4d3a99c13528532c91686cbae8ff7ce7d6d2408e6d8195d7de6ba26d17d2510"
Apr 25 00:12:23.486741 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:23.486396 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx"
Apr 25 00:12:23.501752 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:23.501728 2576 scope.go:117] "RemoveContainer" containerID="1bd3ca611e3b347902baddcaa80d0efbd45fa5327a84b14d8e0f9224f94c0293"
Apr 25 00:12:23.518221 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:23.518191 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx"]
Apr 25 00:12:23.518378 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:23.518238 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-s6hqx"]
Apr 25 00:12:23.518981 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:23.518740 2576 scope.go:117] "RemoveContainer" containerID="782c3ab403b76ebd38be5699774a3507aa9b49ca8ad57f242ac7c005123abc3b"
Apr 25 00:12:24.318362 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:12:24.317590 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65aeadf0-cf9e-4654-a346-be0d86e992bf" path="/var/lib/kubelet/pods/65aeadf0-cf9e-4654-a346-be0d86e992bf/volumes"
Apr 25 00:14:37.029939 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:14:37.029895 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log"
Apr 25 00:14:37.029939 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:14:37.029902 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log"
Apr 25 00:14:37.914730 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:14:37.914700 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn" event={"ID":"a8cf2bda-3f86-434a-8397-0aae6d811109","Type":"ContainerStarted","Data":"77f3774132eba18666f1e80c5ece3e2b5f3097947683b3ef033e4cae5140fd4e"}
Apr 25 00:14:38.919956 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:14:38.919906 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn" event={"ID":"a8cf2bda-3f86-434a-8397-0aae6d811109","Type":"ContainerStarted","Data":"9a6f1d2b9f1b16df04e4d20fd9581a52f4555728f1dc536c798019ff16e1fe71"}
Apr 25 00:14:38.920312 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:14:38.920090 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"
Apr 25 00:14:38.950026 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:14:38.949973 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn" podStartSLOduration=6.7148706 podStartE2EDuration="2m21.949954837s" podCreationTimestamp="2026-04-25 00:12:17 +0000 UTC" firstStartedPulling="2026-04-25 00:12:22.481200589 +0000 UTC m=+1102.762317487" lastFinishedPulling="2026-04-25 00:14:37.716284822 +0000 UTC m=+1237.997401724" observedRunningTime="2026-04-25 00:14:38.94723153 +0000 UTC m=+1239.228348452" watchObservedRunningTime="2026-04-25 00:14:38.949954837 +0000 UTC m=+1239.231071761"
Apr 25 00:14:39.923743 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:14:39.923710 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"
Apr 25 00:14:45.932166 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:14:45.932138 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"
Apr 25 00:15:15.936359 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:15.936328 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"
Apr 25 00:15:17.883408 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:17.883374 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"]
Apr 25 00:15:17.883962 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:17.883764 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn" podUID="a8cf2bda-3f86-434a-8397-0aae6d811109" containerName="kserve-container" containerID="cri-o://77f3774132eba18666f1e80c5ece3e2b5f3097947683b3ef033e4cae5140fd4e" gracePeriod=30
Apr 25 00:15:17.883962 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:17.883800 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn" podUID="a8cf2bda-3f86-434a-8397-0aae6d811109" containerName="kube-rbac-proxy" containerID="cri-o://9a6f1d2b9f1b16df04e4d20fd9581a52f4555728f1dc536c798019ff16e1fe71" gracePeriod=30
Apr 25 00:15:17.997160 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:17.997130 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm"]
Apr 25 00:15:17.997528 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:17.997507 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerName="storage-initializer"
Apr 25 00:15:17.997571 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:17.997533 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerName="storage-initializer"
Apr 25 00:15:17.997571 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:17.997556 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerName="kserve-container"
Apr 25 00:15:17.997571 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:17.997564 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerName="kserve-container"
Apr 25 00:15:17.997658 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:17.997574 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerName="kube-rbac-proxy"
Apr 25 00:15:17.997658 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:17.997583 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerName="kube-rbac-proxy"
Apr 25 00:15:17.997721 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:17.997660 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerName="kube-rbac-proxy"
Apr 25 00:15:17.997721 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:17.997672 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="65aeadf0-cf9e-4654-a346-be0d86e992bf" containerName="kserve-container"
Apr 25 00:15:18.000968 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:18.000950 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm"
Apr 25 00:15:18.003190 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:18.003167 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-predictor-serving-cert\""
Apr 25 00:15:18.003190 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:18.003190 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\""
Apr 25 00:15:18.008298 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:18.008273 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm"]
Apr 25 00:15:18.031774 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:18.031731 2576 generic.go:358] "Generic (PLEG): container finished" podID="a8cf2bda-3f86-434a-8397-0aae6d811109" containerID="9a6f1d2b9f1b16df04e4d20fd9581a52f4555728f1dc536c798019ff16e1fe71" exitCode=2
Apr 25 00:15:18.031910 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:18.031795 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn" event={"ID":"a8cf2bda-3f86-434a-8397-0aae6d811109","Type":"ContainerDied","Data":"9a6f1d2b9f1b16df04e4d20fd9581a52f4555728f1dc536c798019ff16e1fe71"}
Apr 25 00:15:18.106392 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:18.106361 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm\" (UID: \"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm"
Apr 25 00:15:18.106590 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:18.106444 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm\" (UID: \"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm"
Apr 25 00:15:18.106590 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:18.106480 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm\" (UID: \"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm"
Apr 25 00:15:18.106590 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:18.106518 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8scxp\" (UniqueName: \"kubernetes.io/projected/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-kube-api-access-8scxp\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm\" (UID: \"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm"
Apr 25 00:15:18.207138 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:18.207054 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm\" (UID: \"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm"
Apr 25 00:15:18.207138 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:18.207095 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm\" (UID: \"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm"
Apr 25 00:15:18.207138 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:18.207123 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8scxp\" (UniqueName: \"kubernetes.io/projected/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-kube-api-access-8scxp\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm\" (UID: \"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm"
Apr 25 00:15:18.207387 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:18.207161 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm\" (UID: \"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm"
Apr 25 00:15:18.207493 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:18.207473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm\" (UID: \"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm"
Apr 25 00:15:18.207769 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:18.207748 2576 operation_generator.go:615] "MountVolume.SetUp
succeeded for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm\" (UID: \"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" Apr 25 00:15:18.209734 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:18.209710 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm\" (UID: \"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" Apr 25 00:15:18.215102 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:18.215078 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8scxp\" (UniqueName: \"kubernetes.io/projected/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-kube-api-access-8scxp\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm\" (UID: \"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" Apr 25 00:15:18.311732 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:18.311704 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" Apr 25 00:15:18.429153 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:18.429124 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm"] Apr 25 00:15:18.432305 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:15:18.432280 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode24ce7e8_2a38_46a9_99c3_3b71e8abc71e.slice/crio-d38f2ba45a81ba08edeac9d15f54fb3f8b930d3c478433130456f7078d538ac5 WatchSource:0}: Error finding container d38f2ba45a81ba08edeac9d15f54fb3f8b930d3c478433130456f7078d538ac5: Status 404 returned error can't find the container with id d38f2ba45a81ba08edeac9d15f54fb3f8b930d3c478433130456f7078d538ac5 Apr 25 00:15:18.928971 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:18.928946 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn" Apr 25 00:15:19.013347 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.013268 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8cf2bda-3f86-434a-8397-0aae6d811109-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"a8cf2bda-3f86-434a-8397-0aae6d811109\" (UID: \"a8cf2bda-3f86-434a-8397-0aae6d811109\") " Apr 25 00:15:19.013347 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.013324 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8cf2bda-3f86-434a-8397-0aae6d811109-proxy-tls\") pod \"a8cf2bda-3f86-434a-8397-0aae6d811109\" (UID: \"a8cf2bda-3f86-434a-8397-0aae6d811109\") " Apr 25 00:15:19.013563 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.013360 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8cf2bda-3f86-434a-8397-0aae6d811109-kserve-provision-location\") pod \"a8cf2bda-3f86-434a-8397-0aae6d811109\" (UID: \"a8cf2bda-3f86-434a-8397-0aae6d811109\") " Apr 25 00:15:19.013563 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.013408 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftcxh\" (UniqueName: \"kubernetes.io/projected/a8cf2bda-3f86-434a-8397-0aae6d811109-kube-api-access-ftcxh\") pod \"a8cf2bda-3f86-434a-8397-0aae6d811109\" (UID: \"a8cf2bda-3f86-434a-8397-0aae6d811109\") " Apr 25 00:15:19.013696 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.013665 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8cf2bda-3f86-434a-8397-0aae6d811109-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config") pod "a8cf2bda-3f86-434a-8397-0aae6d811109" (UID: "a8cf2bda-3f86-434a-8397-0aae6d811109"). InnerVolumeSpecName "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:15:19.013752 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.013672 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8cf2bda-3f86-434a-8397-0aae6d811109-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a8cf2bda-3f86-434a-8397-0aae6d811109" (UID: "a8cf2bda-3f86-434a-8397-0aae6d811109"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:15:19.015514 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.015491 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8cf2bda-3f86-434a-8397-0aae6d811109-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a8cf2bda-3f86-434a-8397-0aae6d811109" (UID: "a8cf2bda-3f86-434a-8397-0aae6d811109"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:15:19.015618 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.015549 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8cf2bda-3f86-434a-8397-0aae6d811109-kube-api-access-ftcxh" (OuterVolumeSpecName: "kube-api-access-ftcxh") pod "a8cf2bda-3f86-434a-8397-0aae6d811109" (UID: "a8cf2bda-3f86-434a-8397-0aae6d811109"). InnerVolumeSpecName "kube-api-access-ftcxh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:15:19.036624 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.036593 2576 generic.go:358] "Generic (PLEG): container finished" podID="a8cf2bda-3f86-434a-8397-0aae6d811109" containerID="77f3774132eba18666f1e80c5ece3e2b5f3097947683b3ef033e4cae5140fd4e" exitCode=0 Apr 25 00:15:19.036721 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.036662 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn" Apr 25 00:15:19.036721 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.036667 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn" event={"ID":"a8cf2bda-3f86-434a-8397-0aae6d811109","Type":"ContainerDied","Data":"77f3774132eba18666f1e80c5ece3e2b5f3097947683b3ef033e4cae5140fd4e"} Apr 25 00:15:19.036721 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.036706 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn" event={"ID":"a8cf2bda-3f86-434a-8397-0aae6d811109","Type":"ContainerDied","Data":"47967ad9dfa32582d8bdc92fe86ff6330df023e546465baf07380e6f8685b54c"} Apr 25 00:15:19.036854 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.036723 2576 scope.go:117] "RemoveContainer" containerID="9a6f1d2b9f1b16df04e4d20fd9581a52f4555728f1dc536c798019ff16e1fe71" Apr 25 00:15:19.038354 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.038324 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" event={"ID":"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e","Type":"ContainerStarted","Data":"15254865be2a39337c847b660a5f4e9d9a890e56247685a368104bc05e27a718"} Apr 25 00:15:19.038468 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.038359 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" event={"ID":"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e","Type":"ContainerStarted","Data":"d38f2ba45a81ba08edeac9d15f54fb3f8b930d3c478433130456f7078d538ac5"} Apr 25 00:15:19.047245 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.046939 2576 scope.go:117] "RemoveContainer" containerID="77f3774132eba18666f1e80c5ece3e2b5f3097947683b3ef033e4cae5140fd4e" Apr 25 00:15:19.054580 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.054559 2576 scope.go:117] "RemoveContainer" containerID="b03ce628877850c5d1e60448dae5db31236e445a0422184b4dc28de8733af549" Apr 25 00:15:19.062512 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.062497 2576 scope.go:117] "RemoveContainer" containerID="9a6f1d2b9f1b16df04e4d20fd9581a52f4555728f1dc536c798019ff16e1fe71" Apr 25 00:15:19.062811 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:15:19.062783 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a6f1d2b9f1b16df04e4d20fd9581a52f4555728f1dc536c798019ff16e1fe71\": container with ID starting with 9a6f1d2b9f1b16df04e4d20fd9581a52f4555728f1dc536c798019ff16e1fe71 not found: ID does not exist" containerID="9a6f1d2b9f1b16df04e4d20fd9581a52f4555728f1dc536c798019ff16e1fe71" Apr 25 00:15:19.062881 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.062818 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a6f1d2b9f1b16df04e4d20fd9581a52f4555728f1dc536c798019ff16e1fe71"} err="failed to get container status \"9a6f1d2b9f1b16df04e4d20fd9581a52f4555728f1dc536c798019ff16e1fe71\": rpc error: code = NotFound desc = could not find container \"9a6f1d2b9f1b16df04e4d20fd9581a52f4555728f1dc536c798019ff16e1fe71\": container with ID starting with 9a6f1d2b9f1b16df04e4d20fd9581a52f4555728f1dc536c798019ff16e1fe71 not found: ID does not exist" Apr 25 00:15:19.062881 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.062833 
2576 scope.go:117] "RemoveContainer" containerID="77f3774132eba18666f1e80c5ece3e2b5f3097947683b3ef033e4cae5140fd4e" Apr 25 00:15:19.063103 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:15:19.063086 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77f3774132eba18666f1e80c5ece3e2b5f3097947683b3ef033e4cae5140fd4e\": container with ID starting with 77f3774132eba18666f1e80c5ece3e2b5f3097947683b3ef033e4cae5140fd4e not found: ID does not exist" containerID="77f3774132eba18666f1e80c5ece3e2b5f3097947683b3ef033e4cae5140fd4e" Apr 25 00:15:19.063142 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.063108 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77f3774132eba18666f1e80c5ece3e2b5f3097947683b3ef033e4cae5140fd4e"} err="failed to get container status \"77f3774132eba18666f1e80c5ece3e2b5f3097947683b3ef033e4cae5140fd4e\": rpc error: code = NotFound desc = could not find container \"77f3774132eba18666f1e80c5ece3e2b5f3097947683b3ef033e4cae5140fd4e\": container with ID starting with 77f3774132eba18666f1e80c5ece3e2b5f3097947683b3ef033e4cae5140fd4e not found: ID does not exist" Apr 25 00:15:19.063142 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.063123 2576 scope.go:117] "RemoveContainer" containerID="b03ce628877850c5d1e60448dae5db31236e445a0422184b4dc28de8733af549" Apr 25 00:15:19.063371 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:15:19.063348 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b03ce628877850c5d1e60448dae5db31236e445a0422184b4dc28de8733af549\": container with ID starting with b03ce628877850c5d1e60448dae5db31236e445a0422184b4dc28de8733af549 not found: ID does not exist" containerID="b03ce628877850c5d1e60448dae5db31236e445a0422184b4dc28de8733af549" Apr 25 00:15:19.063410 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.063379 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b03ce628877850c5d1e60448dae5db31236e445a0422184b4dc28de8733af549"} err="failed to get container status \"b03ce628877850c5d1e60448dae5db31236e445a0422184b4dc28de8733af549\": rpc error: code = NotFound desc = could not find container \"b03ce628877850c5d1e60448dae5db31236e445a0422184b4dc28de8733af549\": container with ID starting with b03ce628877850c5d1e60448dae5db31236e445a0422184b4dc28de8733af549 not found: ID does not exist" Apr 25 00:15:19.070747 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.070726 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"] Apr 25 00:15:19.073782 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.073762 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-k77zn"] Apr 25 00:15:19.114733 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.114697 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8cf2bda-3f86-434a-8397-0aae6d811109-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:15:19.114733 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.114736 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8cf2bda-3f86-434a-8397-0aae6d811109-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:15:19.114902 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.114754 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8cf2bda-3f86-434a-8397-0aae6d811109-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:15:19.114902 
ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:19.114768 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ftcxh\" (UniqueName: \"kubernetes.io/projected/a8cf2bda-3f86-434a-8397-0aae6d811109-kube-api-access-ftcxh\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:15:20.315317 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:20.315280 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8cf2bda-3f86-434a-8397-0aae6d811109" path="/var/lib/kubelet/pods/a8cf2bda-3f86-434a-8397-0aae6d811109/volumes" Apr 25 00:15:23.051464 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:23.051430 2576 generic.go:358] "Generic (PLEG): container finished" podID="e24ce7e8-2a38-46a9-99c3-3b71e8abc71e" containerID="15254865be2a39337c847b660a5f4e9d9a890e56247685a368104bc05e27a718" exitCode=0 Apr 25 00:15:23.051829 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:23.051505 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" event={"ID":"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e","Type":"ContainerDied","Data":"15254865be2a39337c847b660a5f4e9d9a890e56247685a368104bc05e27a718"} Apr 25 00:15:24.055741 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:24.055709 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" event={"ID":"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e","Type":"ContainerStarted","Data":"249fabfad6e3adc226dfa43f9454aaeef82ffcbc6774a03fb1cf0dd5f3af19ca"} Apr 25 00:15:24.055741 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:24.055743 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" event={"ID":"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e","Type":"ContainerStarted","Data":"79c3d5d14082f482bf8f17d09183fe8b0396438e807439f8737134103cab34dc"} Apr 25 00:15:24.056196 ip-10-0-129-109 kubenswrapper[2576]: 
I0425 00:15:24.056037 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" Apr 25 00:15:24.056196 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:24.056174 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" Apr 25 00:15:24.057351 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:24.057328 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" podUID="e24ce7e8-2a38-46a9-99c3-3b71e8abc71e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 25 00:15:24.073521 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:24.073466 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" podStartSLOduration=7.073448638 podStartE2EDuration="7.073448638s" podCreationTimestamp="2026-04-25 00:15:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:15:24.072238663 +0000 UTC m=+1284.353355583" watchObservedRunningTime="2026-04-25 00:15:24.073448638 +0000 UTC m=+1284.354565562" Apr 25 00:15:25.059265 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:25.059220 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" podUID="e24ce7e8-2a38-46a9-99c3-3b71e8abc71e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 25 00:15:30.063960 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:30.063908 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" Apr 25 00:15:30.064454 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:30.064424 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" podUID="e24ce7e8-2a38-46a9-99c3-3b71e8abc71e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 25 00:15:40.065613 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:40.065584 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" Apr 25 00:15:48.031266 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.031231 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm"] Apr 25 00:15:48.031681 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.031623 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" podUID="e24ce7e8-2a38-46a9-99c3-3b71e8abc71e" containerName="kserve-container" containerID="cri-o://79c3d5d14082f482bf8f17d09183fe8b0396438e807439f8737134103cab34dc" gracePeriod=30 Apr 25 00:15:48.031742 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.031673 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" podUID="e24ce7e8-2a38-46a9-99c3-3b71e8abc71e" containerName="kube-rbac-proxy" containerID="cri-o://249fabfad6e3adc226dfa43f9454aaeef82ffcbc6774a03fb1cf0dd5f3af19ca" gracePeriod=30 Apr 25 00:15:48.104585 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.104553 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk"] Apr 25 00:15:48.104863 ip-10-0-129-109 kubenswrapper[2576]: I0425 
00:15:48.104849 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8cf2bda-3f86-434a-8397-0aae6d811109" containerName="storage-initializer" Apr 25 00:15:48.104863 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.104864 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8cf2bda-3f86-434a-8397-0aae6d811109" containerName="storage-initializer" Apr 25 00:15:48.105012 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.104879 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8cf2bda-3f86-434a-8397-0aae6d811109" containerName="kserve-container" Apr 25 00:15:48.105012 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.104885 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8cf2bda-3f86-434a-8397-0aae6d811109" containerName="kserve-container" Apr 25 00:15:48.105012 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.104898 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8cf2bda-3f86-434a-8397-0aae6d811109" containerName="kube-rbac-proxy" Apr 25 00:15:48.105012 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.104903 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8cf2bda-3f86-434a-8397-0aae6d811109" containerName="kube-rbac-proxy" Apr 25 00:15:48.105012 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.104975 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8cf2bda-3f86-434a-8397-0aae6d811109" containerName="kube-rbac-proxy" Apr 25 00:15:48.105012 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.104989 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8cf2bda-3f86-434a-8397-0aae6d811109" containerName="kserve-container" Apr 25 00:15:48.107958 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.107942 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" Apr 25 00:15:48.111336 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.111290 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-predictor-serving-cert\"" Apr 25 00:15:48.111336 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.111299 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 25 00:15:48.117010 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.116975 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk"] Apr 25 00:15:48.229214 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.229188 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nddmb\" (UniqueName: \"kubernetes.io/projected/1ecdde27-2dbb-4c74-ad52-e4492c47e947-kube-api-access-nddmb\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk\" (UID: \"1ecdde27-2dbb-4c74-ad52-e4492c47e947\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" Apr 25 00:15:48.229354 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.229266 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1ecdde27-2dbb-4c74-ad52-e4492c47e947-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk\" (UID: \"1ecdde27-2dbb-4c74-ad52-e4492c47e947\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" Apr 25 00:15:48.229354 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.229303 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ecdde27-2dbb-4c74-ad52-e4492c47e947-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk\" (UID: \"1ecdde27-2dbb-4c74-ad52-e4492c47e947\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" Apr 25 00:15:48.229468 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.229364 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ecdde27-2dbb-4c74-ad52-e4492c47e947-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk\" (UID: \"1ecdde27-2dbb-4c74-ad52-e4492c47e947\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" Apr 25 00:15:48.330135 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.330052 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1ecdde27-2dbb-4c74-ad52-e4492c47e947-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk\" (UID: \"1ecdde27-2dbb-4c74-ad52-e4492c47e947\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" Apr 25 00:15:48.330135 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.330093 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ecdde27-2dbb-4c74-ad52-e4492c47e947-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk\" (UID: \"1ecdde27-2dbb-4c74-ad52-e4492c47e947\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" Apr 25 00:15:48.330352 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:15:48.330296 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-serving-cert: secret 
"isvc-mlflow-v2-runtime-predictor-serving-cert" not found Apr 25 00:15:48.330352 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.330320 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ecdde27-2dbb-4c74-ad52-e4492c47e947-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk\" (UID: \"1ecdde27-2dbb-4c74-ad52-e4492c47e947\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" Apr 25 00:15:48.330455 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:15:48.330376 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ecdde27-2dbb-4c74-ad52-e4492c47e947-proxy-tls podName:1ecdde27-2dbb-4c74-ad52-e4492c47e947 nodeName:}" failed. No retries permitted until 2026-04-25 00:15:48.830352083 +0000 UTC m=+1309.111468984 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/1ecdde27-2dbb-4c74-ad52-e4492c47e947-proxy-tls") pod "isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" (UID: "1ecdde27-2dbb-4c74-ad52-e4492c47e947") : secret "isvc-mlflow-v2-runtime-predictor-serving-cert" not found Apr 25 00:15:48.330455 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.330423 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nddmb\" (UniqueName: \"kubernetes.io/projected/1ecdde27-2dbb-4c74-ad52-e4492c47e947-kube-api-access-nddmb\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk\" (UID: \"1ecdde27-2dbb-4c74-ad52-e4492c47e947\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" Apr 25 00:15:48.330660 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.330641 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/1ecdde27-2dbb-4c74-ad52-e4492c47e947-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk\" (UID: \"1ecdde27-2dbb-4c74-ad52-e4492c47e947\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" Apr 25 00:15:48.330754 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.330719 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1ecdde27-2dbb-4c74-ad52-e4492c47e947-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk\" (UID: \"1ecdde27-2dbb-4c74-ad52-e4492c47e947\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" Apr 25 00:15:48.339032 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.339011 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nddmb\" (UniqueName: \"kubernetes.io/projected/1ecdde27-2dbb-4c74-ad52-e4492c47e947-kube-api-access-nddmb\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk\" (UID: \"1ecdde27-2dbb-4c74-ad52-e4492c47e947\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" Apr 25 00:15:48.661058 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.661032 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" Apr 25 00:15:48.734141 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.734117 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-kserve-provision-location\") pod \"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e\" (UID: \"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e\") " Apr 25 00:15:48.734270 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.734149 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8scxp\" (UniqueName: \"kubernetes.io/projected/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-kube-api-access-8scxp\") pod \"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e\" (UID: \"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e\") " Apr 25 00:15:48.734270 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.734183 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e\" (UID: \"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e\") " Apr 25 00:15:48.734270 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.734215 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-proxy-tls\") pod \"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e\" (UID: \"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e\") " Apr 25 00:15:48.734528 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.734500 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e" (UID: "e24ce7e8-2a38-46a9-99c3-3b71e8abc71e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:15:48.734607 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.734536 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config") pod "e24ce7e8-2a38-46a9-99c3-3b71e8abc71e" (UID: "e24ce7e8-2a38-46a9-99c3-3b71e8abc71e"). InnerVolumeSpecName "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:15:48.736318 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.736293 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-kube-api-access-8scxp" (OuterVolumeSpecName: "kube-api-access-8scxp") pod "e24ce7e8-2a38-46a9-99c3-3b71e8abc71e" (UID: "e24ce7e8-2a38-46a9-99c3-3b71e8abc71e"). InnerVolumeSpecName "kube-api-access-8scxp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:15:48.736402 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.736366 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e24ce7e8-2a38-46a9-99c3-3b71e8abc71e" (UID: "e24ce7e8-2a38-46a9-99c3-3b71e8abc71e"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:15:48.834825 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.834797 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ecdde27-2dbb-4c74-ad52-e4492c47e947-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk\" (UID: \"1ecdde27-2dbb-4c74-ad52-e4492c47e947\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" Apr 25 00:15:48.834952 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.834870 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:15:48.834952 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.834883 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:15:48.834952 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.834893 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:15:48.834952 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.834902 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8scxp\" (UniqueName: \"kubernetes.io/projected/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e-kube-api-access-8scxp\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:15:48.837158 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:48.837137 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ecdde27-2dbb-4c74-ad52-e4492c47e947-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk\" (UID: \"1ecdde27-2dbb-4c74-ad52-e4492c47e947\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" Apr 25 00:15:49.019014 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.018979 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" Apr 25 00:15:49.130123 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.130091 2576 generic.go:358] "Generic (PLEG): container finished" podID="e24ce7e8-2a38-46a9-99c3-3b71e8abc71e" containerID="249fabfad6e3adc226dfa43f9454aaeef82ffcbc6774a03fb1cf0dd5f3af19ca" exitCode=2 Apr 25 00:15:49.130123 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.130118 2576 generic.go:358] "Generic (PLEG): container finished" podID="e24ce7e8-2a38-46a9-99c3-3b71e8abc71e" containerID="79c3d5d14082f482bf8f17d09183fe8b0396438e807439f8737134103cab34dc" exitCode=0 Apr 25 00:15:49.130613 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.130158 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" event={"ID":"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e","Type":"ContainerDied","Data":"249fabfad6e3adc226dfa43f9454aaeef82ffcbc6774a03fb1cf0dd5f3af19ca"} Apr 25 00:15:49.130613 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.130179 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" Apr 25 00:15:49.130613 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.130193 2576 scope.go:117] "RemoveContainer" containerID="249fabfad6e3adc226dfa43f9454aaeef82ffcbc6774a03fb1cf0dd5f3af19ca" Apr 25 00:15:49.130613 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.130183 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" event={"ID":"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e","Type":"ContainerDied","Data":"79c3d5d14082f482bf8f17d09183fe8b0396438e807439f8737134103cab34dc"} Apr 25 00:15:49.130613 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.130293 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm" event={"ID":"e24ce7e8-2a38-46a9-99c3-3b71e8abc71e","Type":"ContainerDied","Data":"d38f2ba45a81ba08edeac9d15f54fb3f8b930d3c478433130456f7078d538ac5"} Apr 25 00:15:49.138420 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.138402 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk"] Apr 25 00:15:49.139340 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.139271 2576 scope.go:117] "RemoveContainer" containerID="79c3d5d14082f482bf8f17d09183fe8b0396438e807439f8737134103cab34dc" Apr 25 00:15:49.141058 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:15:49.141033 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ecdde27_2dbb_4c74_ad52_e4492c47e947.slice/crio-7163e0d6a8f08d4558388a7abe1e767230086654a9eaac635cfab162cc2d44d0 WatchSource:0}: Error finding container 7163e0d6a8f08d4558388a7abe1e767230086654a9eaac635cfab162cc2d44d0: Status 404 returned error can't find the container with id 7163e0d6a8f08d4558388a7abe1e767230086654a9eaac635cfab162cc2d44d0 Apr 25 
00:15:49.142683 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.142668 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:15:49.149613 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.149597 2576 scope.go:117] "RemoveContainer" containerID="15254865be2a39337c847b660a5f4e9d9a890e56247685a368104bc05e27a718" Apr 25 00:15:49.156441 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.156408 2576 scope.go:117] "RemoveContainer" containerID="249fabfad6e3adc226dfa43f9454aaeef82ffcbc6774a03fb1cf0dd5f3af19ca" Apr 25 00:15:49.156720 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:15:49.156702 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"249fabfad6e3adc226dfa43f9454aaeef82ffcbc6774a03fb1cf0dd5f3af19ca\": container with ID starting with 249fabfad6e3adc226dfa43f9454aaeef82ffcbc6774a03fb1cf0dd5f3af19ca not found: ID does not exist" containerID="249fabfad6e3adc226dfa43f9454aaeef82ffcbc6774a03fb1cf0dd5f3af19ca" Apr 25 00:15:49.156785 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.156731 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"249fabfad6e3adc226dfa43f9454aaeef82ffcbc6774a03fb1cf0dd5f3af19ca"} err="failed to get container status \"249fabfad6e3adc226dfa43f9454aaeef82ffcbc6774a03fb1cf0dd5f3af19ca\": rpc error: code = NotFound desc = could not find container \"249fabfad6e3adc226dfa43f9454aaeef82ffcbc6774a03fb1cf0dd5f3af19ca\": container with ID starting with 249fabfad6e3adc226dfa43f9454aaeef82ffcbc6774a03fb1cf0dd5f3af19ca not found: ID does not exist" Apr 25 00:15:49.156785 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.156754 2576 scope.go:117] "RemoveContainer" containerID="79c3d5d14082f482bf8f17d09183fe8b0396438e807439f8737134103cab34dc" Apr 25 00:15:49.157137 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:15:49.157112 2576 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"79c3d5d14082f482bf8f17d09183fe8b0396438e807439f8737134103cab34dc\": container with ID starting with 79c3d5d14082f482bf8f17d09183fe8b0396438e807439f8737134103cab34dc not found: ID does not exist" containerID="79c3d5d14082f482bf8f17d09183fe8b0396438e807439f8737134103cab34dc" Apr 25 00:15:49.157239 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.157150 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79c3d5d14082f482bf8f17d09183fe8b0396438e807439f8737134103cab34dc"} err="failed to get container status \"79c3d5d14082f482bf8f17d09183fe8b0396438e807439f8737134103cab34dc\": rpc error: code = NotFound desc = could not find container \"79c3d5d14082f482bf8f17d09183fe8b0396438e807439f8737134103cab34dc\": container with ID starting with 79c3d5d14082f482bf8f17d09183fe8b0396438e807439f8737134103cab34dc not found: ID does not exist" Apr 25 00:15:49.157239 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.157170 2576 scope.go:117] "RemoveContainer" containerID="15254865be2a39337c847b660a5f4e9d9a890e56247685a368104bc05e27a718" Apr 25 00:15:49.157893 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:15:49.157852 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15254865be2a39337c847b660a5f4e9d9a890e56247685a368104bc05e27a718\": container with ID starting with 15254865be2a39337c847b660a5f4e9d9a890e56247685a368104bc05e27a718 not found: ID does not exist" containerID="15254865be2a39337c847b660a5f4e9d9a890e56247685a368104bc05e27a718" Apr 25 00:15:49.158008 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.157898 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15254865be2a39337c847b660a5f4e9d9a890e56247685a368104bc05e27a718"} err="failed to get container status 
\"15254865be2a39337c847b660a5f4e9d9a890e56247685a368104bc05e27a718\": rpc error: code = NotFound desc = could not find container \"15254865be2a39337c847b660a5f4e9d9a890e56247685a368104bc05e27a718\": container with ID starting with 15254865be2a39337c847b660a5f4e9d9a890e56247685a368104bc05e27a718 not found: ID does not exist" Apr 25 00:15:49.158008 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.157946 2576 scope.go:117] "RemoveContainer" containerID="249fabfad6e3adc226dfa43f9454aaeef82ffcbc6774a03fb1cf0dd5f3af19ca" Apr 25 00:15:49.158257 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.158221 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"249fabfad6e3adc226dfa43f9454aaeef82ffcbc6774a03fb1cf0dd5f3af19ca"} err="failed to get container status \"249fabfad6e3adc226dfa43f9454aaeef82ffcbc6774a03fb1cf0dd5f3af19ca\": rpc error: code = NotFound desc = could not find container \"249fabfad6e3adc226dfa43f9454aaeef82ffcbc6774a03fb1cf0dd5f3af19ca\": container with ID starting with 249fabfad6e3adc226dfa43f9454aaeef82ffcbc6774a03fb1cf0dd5f3af19ca not found: ID does not exist" Apr 25 00:15:49.158402 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.158259 2576 scope.go:117] "RemoveContainer" containerID="79c3d5d14082f482bf8f17d09183fe8b0396438e807439f8737134103cab34dc" Apr 25 00:15:49.158508 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.158487 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79c3d5d14082f482bf8f17d09183fe8b0396438e807439f8737134103cab34dc"} err="failed to get container status \"79c3d5d14082f482bf8f17d09183fe8b0396438e807439f8737134103cab34dc\": rpc error: code = NotFound desc = could not find container \"79c3d5d14082f482bf8f17d09183fe8b0396438e807439f8737134103cab34dc\": container with ID starting with 79c3d5d14082f482bf8f17d09183fe8b0396438e807439f8737134103cab34dc not found: ID does not exist" Apr 25 00:15:49.158581 ip-10-0-129-109 
kubenswrapper[2576]: I0425 00:15:49.158512 2576 scope.go:117] "RemoveContainer" containerID="15254865be2a39337c847b660a5f4e9d9a890e56247685a368104bc05e27a718" Apr 25 00:15:49.158845 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.158825 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15254865be2a39337c847b660a5f4e9d9a890e56247685a368104bc05e27a718"} err="failed to get container status \"15254865be2a39337c847b660a5f4e9d9a890e56247685a368104bc05e27a718\": rpc error: code = NotFound desc = could not find container \"15254865be2a39337c847b660a5f4e9d9a890e56247685a368104bc05e27a718\": container with ID starting with 15254865be2a39337c847b660a5f4e9d9a890e56247685a368104bc05e27a718 not found: ID does not exist" Apr 25 00:15:49.159616 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.159597 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm"] Apr 25 00:15:49.163799 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:49.163774 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-mpttm"] Apr 25 00:15:50.134597 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:50.134558 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" event={"ID":"1ecdde27-2dbb-4c74-ad52-e4492c47e947","Type":"ContainerStarted","Data":"402f902b7e4d95a00c0ad3ab5a5d44ebdc598814efb630a65315526094e70485"} Apr 25 00:15:50.134597 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:50.134598 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" event={"ID":"1ecdde27-2dbb-4c74-ad52-e4492c47e947","Type":"ContainerStarted","Data":"7163e0d6a8f08d4558388a7abe1e767230086654a9eaac635cfab162cc2d44d0"} Apr 25 00:15:50.315712 ip-10-0-129-109 kubenswrapper[2576]: I0425 
00:15:50.315672 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e24ce7e8-2a38-46a9-99c3-3b71e8abc71e" path="/var/lib/kubelet/pods/e24ce7e8-2a38-46a9-99c3-3b71e8abc71e/volumes" Apr 25 00:15:53.145537 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:53.145506 2576 generic.go:358] "Generic (PLEG): container finished" podID="1ecdde27-2dbb-4c74-ad52-e4492c47e947" containerID="402f902b7e4d95a00c0ad3ab5a5d44ebdc598814efb630a65315526094e70485" exitCode=0 Apr 25 00:15:53.145863 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:53.145567 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" event={"ID":"1ecdde27-2dbb-4c74-ad52-e4492c47e947","Type":"ContainerDied","Data":"402f902b7e4d95a00c0ad3ab5a5d44ebdc598814efb630a65315526094e70485"} Apr 25 00:15:54.150528 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:54.150494 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" event={"ID":"1ecdde27-2dbb-4c74-ad52-e4492c47e947","Type":"ContainerStarted","Data":"02681b8dbc639ddadb9255ccb2bddcda8d405bf2aa6b4e9aff05b6180ca68e28"} Apr 25 00:15:54.150528 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:54.150534 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" event={"ID":"1ecdde27-2dbb-4c74-ad52-e4492c47e947","Type":"ContainerStarted","Data":"ddc13ddec93959d8c8d8b70ef77649c1830a78a0fb6b9c2311d62eb90f7640e2"} Apr 25 00:15:54.150986 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:54.150878 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" Apr 25 00:15:54.150986 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:54.150932 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" Apr 25 00:15:54.168951 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:15:54.168885 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" podStartSLOduration=6.168871575 podStartE2EDuration="6.168871575s" podCreationTimestamp="2026-04-25 00:15:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:15:54.168084318 +0000 UTC m=+1314.449201262" watchObservedRunningTime="2026-04-25 00:15:54.168871575 +0000 UTC m=+1314.449988495" Apr 25 00:16:00.158439 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:00.158412 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" Apr 25 00:16:30.161809 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:30.161778 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" Apr 25 00:16:38.182074 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.182034 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk"] Apr 25 00:16:38.182545 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.182411 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" podUID="1ecdde27-2dbb-4c74-ad52-e4492c47e947" containerName="kserve-container" containerID="cri-o://ddc13ddec93959d8c8d8b70ef77649c1830a78a0fb6b9c2311d62eb90f7640e2" gracePeriod=30 Apr 25 00:16:38.182545 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.182459 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" podUID="1ecdde27-2dbb-4c74-ad52-e4492c47e947" containerName="kube-rbac-proxy" containerID="cri-o://02681b8dbc639ddadb9255ccb2bddcda8d405bf2aa6b4e9aff05b6180ca68e28" gracePeriod=30 Apr 25 00:16:38.257115 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.257083 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng"] Apr 25 00:16:38.257409 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.257395 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e24ce7e8-2a38-46a9-99c3-3b71e8abc71e" containerName="kube-rbac-proxy" Apr 25 00:16:38.257451 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.257411 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e24ce7e8-2a38-46a9-99c3-3b71e8abc71e" containerName="kube-rbac-proxy" Apr 25 00:16:38.257451 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.257424 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e24ce7e8-2a38-46a9-99c3-3b71e8abc71e" containerName="storage-initializer" Apr 25 00:16:38.257451 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.257431 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e24ce7e8-2a38-46a9-99c3-3b71e8abc71e" containerName="storage-initializer" Apr 25 00:16:38.257451 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.257438 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e24ce7e8-2a38-46a9-99c3-3b71e8abc71e" containerName="kserve-container" Apr 25 00:16:38.257451 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.257443 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e24ce7e8-2a38-46a9-99c3-3b71e8abc71e" containerName="kserve-container" Apr 25 00:16:38.257593 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.257494 2576 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="e24ce7e8-2a38-46a9-99c3-3b71e8abc71e" containerName="kserve-container" Apr 25 00:16:38.257593 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.257502 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e24ce7e8-2a38-46a9-99c3-3b71e8abc71e" containerName="kube-rbac-proxy" Apr 25 00:16:38.261046 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.261026 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" Apr 25 00:16:38.263368 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.263346 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-predictor-serving-cert\"" Apr 25 00:16:38.263504 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.263483 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\"" Apr 25 00:16:38.271838 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.271815 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng"] Apr 25 00:16:38.300196 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.300171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn7pv\" (UniqueName: \"kubernetes.io/projected/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-kube-api-access-pn7pv\") pod \"isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng\" (UID: \"6a612862-dbb6-4eaf-bf08-ee50f2f2df52\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" Apr 25 00:16:38.300281 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.300210 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng\" (UID: \"6a612862-dbb6-4eaf-bf08-ee50f2f2df52\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" Apr 25 00:16:38.300281 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.300259 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng\" (UID: \"6a612862-dbb6-4eaf-bf08-ee50f2f2df52\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" Apr 25 00:16:38.300428 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.300368 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng\" (UID: \"6a612862-dbb6-4eaf-bf08-ee50f2f2df52\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" Apr 25 00:16:38.401005 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.400973 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng\" (UID: \"6a612862-dbb6-4eaf-bf08-ee50f2f2df52\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" Apr 25 00:16:38.401212 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.401016 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pn7pv\" (UniqueName: \"kubernetes.io/projected/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-kube-api-access-pn7pv\") pod 
\"isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng\" (UID: \"6a612862-dbb6-4eaf-bf08-ee50f2f2df52\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" Apr 25 00:16:38.401212 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.401045 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng\" (UID: \"6a612862-dbb6-4eaf-bf08-ee50f2f2df52\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" Apr 25 00:16:38.401212 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.401074 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng\" (UID: \"6a612862-dbb6-4eaf-bf08-ee50f2f2df52\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" Apr 25 00:16:38.401439 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.401415 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng\" (UID: \"6a612862-dbb6-4eaf-bf08-ee50f2f2df52\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" Apr 25 00:16:38.401710 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.401684 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng\" (UID: \"6a612862-dbb6-4eaf-bf08-ee50f2f2df52\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" Apr 25 00:16:38.403634 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.403607 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng\" (UID: \"6a612862-dbb6-4eaf-bf08-ee50f2f2df52\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" Apr 25 00:16:38.409689 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.409664 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn7pv\" (UniqueName: \"kubernetes.io/projected/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-kube-api-access-pn7pv\") pod \"isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng\" (UID: \"6a612862-dbb6-4eaf-bf08-ee50f2f2df52\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" Apr 25 00:16:38.571061 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.571017 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" Apr 25 00:16:38.701085 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:38.701056 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng"] Apr 25 00:16:38.703346 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:16:38.703316 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a612862_dbb6_4eaf_bf08_ee50f2f2df52.slice/crio-27e61bacb963f2f6f64ee23444423efc5c9d031027676568a76fc3e84b985905 WatchSource:0}: Error finding container 27e61bacb963f2f6f64ee23444423efc5c9d031027676568a76fc3e84b985905: Status 404 returned error can't find the container with id 27e61bacb963f2f6f64ee23444423efc5c9d031027676568a76fc3e84b985905 Apr 25 00:16:39.286127 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:39.286087 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" event={"ID":"6a612862-dbb6-4eaf-bf08-ee50f2f2df52","Type":"ContainerStarted","Data":"811f7b02c1cd22d7b08ea50ce1ae6a3699e37fe21c99556eb63b0912d46c20dc"} Apr 25 00:16:39.286591 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:39.286136 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" event={"ID":"6a612862-dbb6-4eaf-bf08-ee50f2f2df52","Type":"ContainerStarted","Data":"27e61bacb963f2f6f64ee23444423efc5c9d031027676568a76fc3e84b985905"} Apr 25 00:16:39.288420 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:39.288395 2576 generic.go:358] "Generic (PLEG): container finished" podID="1ecdde27-2dbb-4c74-ad52-e4492c47e947" containerID="02681b8dbc639ddadb9255ccb2bddcda8d405bf2aa6b4e9aff05b6180ca68e28" exitCode=2 Apr 25 00:16:39.288560 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:39.288437 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" event={"ID":"1ecdde27-2dbb-4c74-ad52-e4492c47e947","Type":"ContainerDied","Data":"02681b8dbc639ddadb9255ccb2bddcda8d405bf2aa6b4e9aff05b6180ca68e28"} Apr 25 00:16:39.421991 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:39.421968 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" Apr 25 00:16:39.507793 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:39.507708 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1ecdde27-2dbb-4c74-ad52-e4492c47e947-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"1ecdde27-2dbb-4c74-ad52-e4492c47e947\" (UID: \"1ecdde27-2dbb-4c74-ad52-e4492c47e947\") " Apr 25 00:16:39.507793 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:39.507755 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nddmb\" (UniqueName: \"kubernetes.io/projected/1ecdde27-2dbb-4c74-ad52-e4492c47e947-kube-api-access-nddmb\") pod \"1ecdde27-2dbb-4c74-ad52-e4492c47e947\" (UID: \"1ecdde27-2dbb-4c74-ad52-e4492c47e947\") " Apr 25 00:16:39.508039 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:39.507800 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ecdde27-2dbb-4c74-ad52-e4492c47e947-proxy-tls\") pod \"1ecdde27-2dbb-4c74-ad52-e4492c47e947\" (UID: \"1ecdde27-2dbb-4c74-ad52-e4492c47e947\") " Apr 25 00:16:39.508108 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:39.508063 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ecdde27-2dbb-4c74-ad52-e4492c47e947-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config") 
pod "1ecdde27-2dbb-4c74-ad52-e4492c47e947" (UID: "1ecdde27-2dbb-4c74-ad52-e4492c47e947"). InnerVolumeSpecName "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:16:39.510015 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:39.509992 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ecdde27-2dbb-4c74-ad52-e4492c47e947-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1ecdde27-2dbb-4c74-ad52-e4492c47e947" (UID: "1ecdde27-2dbb-4c74-ad52-e4492c47e947"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:16:39.510115 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:39.510089 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ecdde27-2dbb-4c74-ad52-e4492c47e947-kube-api-access-nddmb" (OuterVolumeSpecName: "kube-api-access-nddmb") pod "1ecdde27-2dbb-4c74-ad52-e4492c47e947" (UID: "1ecdde27-2dbb-4c74-ad52-e4492c47e947"). InnerVolumeSpecName "kube-api-access-nddmb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:16:39.609060 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:39.609007 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ecdde27-2dbb-4c74-ad52-e4492c47e947-kserve-provision-location\") pod \"1ecdde27-2dbb-4c74-ad52-e4492c47e947\" (UID: \"1ecdde27-2dbb-4c74-ad52-e4492c47e947\") " Apr 25 00:16:39.609271 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:39.609247 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1ecdde27-2dbb-4c74-ad52-e4492c47e947-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:16:39.609336 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:39.609273 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nddmb\" (UniqueName: \"kubernetes.io/projected/1ecdde27-2dbb-4c74-ad52-e4492c47e947-kube-api-access-nddmb\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:16:39.609336 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:39.609290 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ecdde27-2dbb-4c74-ad52-e4492c47e947-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:16:39.609408 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:39.609324 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ecdde27-2dbb-4c74-ad52-e4492c47e947-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1ecdde27-2dbb-4c74-ad52-e4492c47e947" (UID: "1ecdde27-2dbb-4c74-ad52-e4492c47e947"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:16:39.710610 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:39.710556 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ecdde27-2dbb-4c74-ad52-e4492c47e947-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:16:40.293029 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:40.292993 2576 generic.go:358] "Generic (PLEG): container finished" podID="1ecdde27-2dbb-4c74-ad52-e4492c47e947" containerID="ddc13ddec93959d8c8d8b70ef77649c1830a78a0fb6b9c2311d62eb90f7640e2" exitCode=0 Apr 25 00:16:40.293478 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:40.293083 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" Apr 25 00:16:40.293478 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:40.293125 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" event={"ID":"1ecdde27-2dbb-4c74-ad52-e4492c47e947","Type":"ContainerDied","Data":"ddc13ddec93959d8c8d8b70ef77649c1830a78a0fb6b9c2311d62eb90f7640e2"} Apr 25 00:16:40.293478 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:40.293175 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk" event={"ID":"1ecdde27-2dbb-4c74-ad52-e4492c47e947","Type":"ContainerDied","Data":"7163e0d6a8f08d4558388a7abe1e767230086654a9eaac635cfab162cc2d44d0"} Apr 25 00:16:40.293478 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:40.293196 2576 scope.go:117] "RemoveContainer" containerID="02681b8dbc639ddadb9255ccb2bddcda8d405bf2aa6b4e9aff05b6180ca68e28" Apr 25 00:16:40.301593 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:40.301424 2576 scope.go:117] "RemoveContainer" 
containerID="ddc13ddec93959d8c8d8b70ef77649c1830a78a0fb6b9c2311d62eb90f7640e2" Apr 25 00:16:40.308254 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:40.308232 2576 scope.go:117] "RemoveContainer" containerID="402f902b7e4d95a00c0ad3ab5a5d44ebdc598814efb630a65315526094e70485" Apr 25 00:16:40.316703 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:40.316684 2576 scope.go:117] "RemoveContainer" containerID="02681b8dbc639ddadb9255ccb2bddcda8d405bf2aa6b4e9aff05b6180ca68e28" Apr 25 00:16:40.316842 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:40.316821 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk"] Apr 25 00:16:40.317028 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:16:40.316999 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02681b8dbc639ddadb9255ccb2bddcda8d405bf2aa6b4e9aff05b6180ca68e28\": container with ID starting with 02681b8dbc639ddadb9255ccb2bddcda8d405bf2aa6b4e9aff05b6180ca68e28 not found: ID does not exist" containerID="02681b8dbc639ddadb9255ccb2bddcda8d405bf2aa6b4e9aff05b6180ca68e28" Apr 25 00:16:40.317154 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:40.317033 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02681b8dbc639ddadb9255ccb2bddcda8d405bf2aa6b4e9aff05b6180ca68e28"} err="failed to get container status \"02681b8dbc639ddadb9255ccb2bddcda8d405bf2aa6b4e9aff05b6180ca68e28\": rpc error: code = NotFound desc = could not find container \"02681b8dbc639ddadb9255ccb2bddcda8d405bf2aa6b4e9aff05b6180ca68e28\": container with ID starting with 02681b8dbc639ddadb9255ccb2bddcda8d405bf2aa6b4e9aff05b6180ca68e28 not found: ID does not exist" Apr 25 00:16:40.317154 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:40.317053 2576 scope.go:117] "RemoveContainer" containerID="ddc13ddec93959d8c8d8b70ef77649c1830a78a0fb6b9c2311d62eb90f7640e2" Apr 25 
00:16:40.317154 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:40.317042 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-cnjnk"] Apr 25 00:16:40.317326 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:16:40.317308 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddc13ddec93959d8c8d8b70ef77649c1830a78a0fb6b9c2311d62eb90f7640e2\": container with ID starting with ddc13ddec93959d8c8d8b70ef77649c1830a78a0fb6b9c2311d62eb90f7640e2 not found: ID does not exist" containerID="ddc13ddec93959d8c8d8b70ef77649c1830a78a0fb6b9c2311d62eb90f7640e2" Apr 25 00:16:40.317362 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:40.317333 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddc13ddec93959d8c8d8b70ef77649c1830a78a0fb6b9c2311d62eb90f7640e2"} err="failed to get container status \"ddc13ddec93959d8c8d8b70ef77649c1830a78a0fb6b9c2311d62eb90f7640e2\": rpc error: code = NotFound desc = could not find container \"ddc13ddec93959d8c8d8b70ef77649c1830a78a0fb6b9c2311d62eb90f7640e2\": container with ID starting with ddc13ddec93959d8c8d8b70ef77649c1830a78a0fb6b9c2311d62eb90f7640e2 not found: ID does not exist" Apr 25 00:16:40.317362 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:40.317350 2576 scope.go:117] "RemoveContainer" containerID="402f902b7e4d95a00c0ad3ab5a5d44ebdc598814efb630a65315526094e70485" Apr 25 00:16:40.317560 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:16:40.317544 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"402f902b7e4d95a00c0ad3ab5a5d44ebdc598814efb630a65315526094e70485\": container with ID starting with 402f902b7e4d95a00c0ad3ab5a5d44ebdc598814efb630a65315526094e70485 not found: ID does not exist" containerID="402f902b7e4d95a00c0ad3ab5a5d44ebdc598814efb630a65315526094e70485" Apr 25 
00:16:40.317643 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:40.317566 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"402f902b7e4d95a00c0ad3ab5a5d44ebdc598814efb630a65315526094e70485"} err="failed to get container status \"402f902b7e4d95a00c0ad3ab5a5d44ebdc598814efb630a65315526094e70485\": rpc error: code = NotFound desc = could not find container \"402f902b7e4d95a00c0ad3ab5a5d44ebdc598814efb630a65315526094e70485\": container with ID starting with 402f902b7e4d95a00c0ad3ab5a5d44ebdc598814efb630a65315526094e70485 not found: ID does not exist" Apr 25 00:16:42.315514 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:42.315475 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ecdde27-2dbb-4c74-ad52-e4492c47e947" path="/var/lib/kubelet/pods/1ecdde27-2dbb-4c74-ad52-e4492c47e947/volumes" Apr 25 00:16:43.309950 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:43.309890 2576 generic.go:358] "Generic (PLEG): container finished" podID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerID="811f7b02c1cd22d7b08ea50ce1ae6a3699e37fe21c99556eb63b0912d46c20dc" exitCode=0 Apr 25 00:16:43.310126 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:43.309961 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" event={"ID":"6a612862-dbb6-4eaf-bf08-ee50f2f2df52","Type":"ContainerDied","Data":"811f7b02c1cd22d7b08ea50ce1ae6a3699e37fe21c99556eb63b0912d46c20dc"} Apr 25 00:16:44.316491 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:44.316440 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" event={"ID":"6a612862-dbb6-4eaf-bf08-ee50f2f2df52","Type":"ContainerStarted","Data":"5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a"} Apr 25 00:16:47.328568 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:47.328535 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" event={"ID":"6a612862-dbb6-4eaf-bf08-ee50f2f2df52","Type":"ContainerStarted","Data":"963878639b28ab5237aabcb742a7dc507c0fb0b792514f2198af2913fce02e12"} Apr 25 00:16:47.328568 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:47.328569 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" event={"ID":"6a612862-dbb6-4eaf-bf08-ee50f2f2df52","Type":"ContainerStarted","Data":"c9b525939fabc16a449831eaa3e891e799695388a215091b75f76cc907adc51f"} Apr 25 00:16:47.329024 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:47.328742 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" Apr 25 00:16:47.329024 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:47.328889 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" Apr 25 00:16:47.348223 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:47.348177 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" podStartSLOduration=6.457978302 podStartE2EDuration="9.348166134s" podCreationTimestamp="2026-04-25 00:16:38 +0000 UTC" firstStartedPulling="2026-04-25 00:16:43.372483762 +0000 UTC m=+1363.653600664" lastFinishedPulling="2026-04-25 00:16:46.262671584 +0000 UTC m=+1366.543788496" observedRunningTime="2026-04-25 00:16:47.346954882 +0000 UTC m=+1367.628071799" watchObservedRunningTime="2026-04-25 00:16:47.348166134 +0000 UTC m=+1367.629283054" Apr 25 00:16:48.330814 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:16:48.330786 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" Apr 25 00:16:54.338585 ip-10-0-129-109 kubenswrapper[2576]: 
I0425 00:16:54.338558 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" Apr 25 00:17:14.340297 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:14.340264 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" Apr 25 00:17:54.341203 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:54.341177 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" Apr 25 00:17:58.324060 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.324029 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng"] Apr 25 00:17:58.324515 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.324474 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerName="kserve-container" containerID="cri-o://5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a" gracePeriod=30 Apr 25 00:17:58.324642 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.324515 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerName="kserve-agent" containerID="cri-o://963878639b28ab5237aabcb742a7dc507c0fb0b792514f2198af2913fce02e12" gracePeriod=30 Apr 25 00:17:58.324642 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.324515 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerName="kube-rbac-proxy" 
containerID="cri-o://c9b525939fabc16a449831eaa3e891e799695388a215091b75f76cc907adc51f" gracePeriod=30 Apr 25 00:17:58.388336 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.388313 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8"] Apr 25 00:17:58.388631 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.388616 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ecdde27-2dbb-4c74-ad52-e4492c47e947" containerName="storage-initializer" Apr 25 00:17:58.388705 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.388634 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ecdde27-2dbb-4c74-ad52-e4492c47e947" containerName="storage-initializer" Apr 25 00:17:58.388705 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.388664 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ecdde27-2dbb-4c74-ad52-e4492c47e947" containerName="kserve-container" Apr 25 00:17:58.388705 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.388674 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ecdde27-2dbb-4c74-ad52-e4492c47e947" containerName="kserve-container" Apr 25 00:17:58.388705 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.388683 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ecdde27-2dbb-4c74-ad52-e4492c47e947" containerName="kube-rbac-proxy" Apr 25 00:17:58.388705 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.388692 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ecdde27-2dbb-4c74-ad52-e4492c47e947" containerName="kube-rbac-proxy" Apr 25 00:17:58.388977 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.388755 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ecdde27-2dbb-4c74-ad52-e4492c47e947" containerName="kube-rbac-proxy" Apr 25 00:17:58.388977 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.388770 2576 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="1ecdde27-2dbb-4c74-ad52-e4492c47e947" containerName="kserve-container" Apr 25 00:17:58.391050 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.391033 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" Apr 25 00:17:58.393213 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.393195 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-predictor-serving-cert\"" Apr 25 00:17:58.393301 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.393215 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-kube-rbac-proxy-sar-config\"" Apr 25 00:17:58.400299 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.400278 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8"] Apr 25 00:17:58.476035 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.476008 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6ac957c-316d-418b-a10b-fc8abdba63e0-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-776w8\" (UID: \"c6ac957c-316d-418b-a10b-fc8abdba63e0\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" Apr 25 00:17:58.476140 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.476048 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6ac957c-316d-418b-a10b-fc8abdba63e0-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-776w8\" (UID: \"c6ac957c-316d-418b-a10b-fc8abdba63e0\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" Apr 25 00:17:58.476140 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.476076 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c6ac957c-316d-418b-a10b-fc8abdba63e0-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-776w8\" (UID: \"c6ac957c-316d-418b-a10b-fc8abdba63e0\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" Apr 25 00:17:58.476221 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.476133 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2jrh\" (UniqueName: \"kubernetes.io/projected/c6ac957c-316d-418b-a10b-fc8abdba63e0-kube-api-access-l2jrh\") pod \"isvc-paddle-predictor-6b8b7cfb4b-776w8\" (UID: \"c6ac957c-316d-418b-a10b-fc8abdba63e0\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" Apr 25 00:17:58.520044 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.520004 2576 generic.go:358] "Generic (PLEG): container finished" podID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerID="c9b525939fabc16a449831eaa3e891e799695388a215091b75f76cc907adc51f" exitCode=2 Apr 25 00:17:58.520044 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.520048 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" event={"ID":"6a612862-dbb6-4eaf-bf08-ee50f2f2df52","Type":"ContainerDied","Data":"c9b525939fabc16a449831eaa3e891e799695388a215091b75f76cc907adc51f"} Apr 25 00:17:58.577021 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.576942 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6ac957c-316d-418b-a10b-fc8abdba63e0-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-776w8\" (UID: \"c6ac957c-316d-418b-a10b-fc8abdba63e0\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" Apr 25 00:17:58.577021 
ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.576979 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6ac957c-316d-418b-a10b-fc8abdba63e0-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-776w8\" (UID: \"c6ac957c-316d-418b-a10b-fc8abdba63e0\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" Apr 25 00:17:58.577021 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.577006 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c6ac957c-316d-418b-a10b-fc8abdba63e0-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-776w8\" (UID: \"c6ac957c-316d-418b-a10b-fc8abdba63e0\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" Apr 25 00:17:58.577282 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.577036 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2jrh\" (UniqueName: \"kubernetes.io/projected/c6ac957c-316d-418b-a10b-fc8abdba63e0-kube-api-access-l2jrh\") pod \"isvc-paddle-predictor-6b8b7cfb4b-776w8\" (UID: \"c6ac957c-316d-418b-a10b-fc8abdba63e0\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" Apr 25 00:17:58.577374 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.577351 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6ac957c-316d-418b-a10b-fc8abdba63e0-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-776w8\" (UID: \"c6ac957c-316d-418b-a10b-fc8abdba63e0\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" Apr 25 00:17:58.577709 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.577675 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/c6ac957c-316d-418b-a10b-fc8abdba63e0-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-776w8\" (UID: \"c6ac957c-316d-418b-a10b-fc8abdba63e0\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" Apr 25 00:17:58.579673 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.579651 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6ac957c-316d-418b-a10b-fc8abdba63e0-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-776w8\" (UID: \"c6ac957c-316d-418b-a10b-fc8abdba63e0\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" Apr 25 00:17:58.585170 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.585127 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2jrh\" (UniqueName: \"kubernetes.io/projected/c6ac957c-316d-418b-a10b-fc8abdba63e0-kube-api-access-l2jrh\") pod \"isvc-paddle-predictor-6b8b7cfb4b-776w8\" (UID: \"c6ac957c-316d-418b-a10b-fc8abdba63e0\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" Apr 25 00:17:58.702137 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.702105 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" Apr 25 00:17:58.825126 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:58.825017 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8"] Apr 25 00:17:58.827611 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:17:58.827546 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6ac957c_316d_418b_a10b_fc8abdba63e0.slice/crio-d38724075d809129586953baec4ff9ad2c21e75989f13cb70a0a36b1a4eb6596 WatchSource:0}: Error finding container d38724075d809129586953baec4ff9ad2c21e75989f13cb70a0a36b1a4eb6596: Status 404 returned error can't find the container with id d38724075d809129586953baec4ff9ad2c21e75989f13cb70a0a36b1a4eb6596 Apr 25 00:17:59.334486 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:59.334440 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused" Apr 25 00:17:59.523959 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:59.523894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" event={"ID":"c6ac957c-316d-418b-a10b-fc8abdba63e0","Type":"ContainerStarted","Data":"e6777a14ada05d7cd232d30964e63f773849ec29d714f1c7b19fe34c79913ca6"} Apr 25 00:17:59.524134 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:17:59.523966 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" event={"ID":"c6ac957c-316d-418b-a10b-fc8abdba63e0","Type":"ContainerStarted","Data":"d38724075d809129586953baec4ff9ad2c21e75989f13cb70a0a36b1a4eb6596"} Apr 25 00:18:00.533580 
ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:00.533549 2576 generic.go:358] "Generic (PLEG): container finished" podID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerID="5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a" exitCode=0 Apr 25 00:18:00.533963 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:00.533618 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" event={"ID":"6a612862-dbb6-4eaf-bf08-ee50f2f2df52","Type":"ContainerDied","Data":"5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a"} Apr 25 00:18:00.772583 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:18:00.772551 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a612862_dbb6_4eaf_bf08_ee50f2f2df52.slice/crio-conmon-5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a.scope\": RecentStats: unable to find data in memory cache]" Apr 25 00:18:03.225881 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:18:03.225851 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a612862_dbb6_4eaf_bf08_ee50f2f2df52.slice/crio-conmon-5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a.scope\": RecentStats: unable to find data in memory cache]" Apr 25 00:18:03.542960 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:03.542898 2576 generic.go:358] "Generic (PLEG): container finished" podID="c6ac957c-316d-418b-a10b-fc8abdba63e0" containerID="e6777a14ada05d7cd232d30964e63f773849ec29d714f1c7b19fe34c79913ca6" exitCode=0 Apr 25 00:18:03.543268 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:03.542975 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" 
event={"ID":"c6ac957c-316d-418b-a10b-fc8abdba63e0","Type":"ContainerDied","Data":"e6777a14ada05d7cd232d30964e63f773849ec29d714f1c7b19fe34c79913ca6"} Apr 25 00:18:04.333612 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:04.333569 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused" Apr 25 00:18:04.339071 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:04.339033 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.31:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.31:8080: connect: connection refused" Apr 25 00:18:09.334402 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:09.334350 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused" Apr 25 00:18:09.334894 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:09.334540 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" Apr 25 00:18:13.265006 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:18:13.264971 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a612862_dbb6_4eaf_bf08_ee50f2f2df52.slice/crio-conmon-5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a.scope\": RecentStats: unable to find data in memory cache]"
Apr 25 00:18:14.333934 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:14.333882 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused"
Apr 25 00:18:14.339867 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:14.339832 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.31:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 25 00:18:15.583185 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:15.583150 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" event={"ID":"c6ac957c-316d-418b-a10b-fc8abdba63e0","Type":"ContainerStarted","Data":"9e25d242bfad6703ebe0d6a7cc32ca47274ed06d0d22a9c65257dc2cbd519ac5"}
Apr 25 00:18:15.583185 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:15.583188 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" event={"ID":"c6ac957c-316d-418b-a10b-fc8abdba63e0","Type":"ContainerStarted","Data":"922e9498658eeee2fdaf1675fef47a71c8a887d2151c5fade6271ef08c845fc7"}
Apr 25 00:18:15.583666 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:15.583486 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8"
Apr 25 00:18:15.583666 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:15.583630 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8"
Apr 25 00:18:15.584801 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:15.584777 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" podUID="c6ac957c-316d-418b-a10b-fc8abdba63e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 25 00:18:15.600974 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:15.600900 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" podStartSLOduration=6.018887146 podStartE2EDuration="17.600883829s" podCreationTimestamp="2026-04-25 00:17:58 +0000 UTC" firstStartedPulling="2026-04-25 00:18:03.544185775 +0000 UTC m=+1443.825302673" lastFinishedPulling="2026-04-25 00:18:15.126182448 +0000 UTC m=+1455.407299356" observedRunningTime="2026-04-25 00:18:15.599688213 +0000 UTC m=+1455.880805133" watchObservedRunningTime="2026-04-25 00:18:15.600883829 +0000 UTC m=+1455.882000748"
Apr 25 00:18:15.735343 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:18:15.731177 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a612862_dbb6_4eaf_bf08_ee50f2f2df52.slice/crio-conmon-5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a.scope\": RecentStats: unable to find data in memory cache]"
Apr 25 00:18:16.586167 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:16.586124 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" podUID="c6ac957c-316d-418b-a10b-fc8abdba63e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 25 00:18:19.334309 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:19.334269 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused"
Apr 25 00:18:21.591132 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:21.591096 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8"
Apr 25 00:18:21.591665 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:21.591637 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" podUID="c6ac957c-316d-418b-a10b-fc8abdba63e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 25 00:18:23.273481 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:18:23.273446 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a612862_dbb6_4eaf_bf08_ee50f2f2df52.slice/crio-conmon-5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a.scope\": RecentStats: unable to find data in memory cache]"
Apr 25 00:18:24.334446 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:24.334404 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.31:8643/healthz\": dial tcp 10.134.0.31:8643: connect: connection refused"
Apr 25 00:18:24.339084 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:24.339054 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.31:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 25 00:18:24.339227 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:24.339166 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng"
Apr 25 00:18:28.492910 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.492885 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng"
Apr 25 00:18:28.512577 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.512542 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-proxy-tls\") pod \"6a612862-dbb6-4eaf-bf08-ee50f2f2df52\" (UID: \"6a612862-dbb6-4eaf-bf08-ee50f2f2df52\") "
Apr 25 00:18:28.512733 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.512594 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-kserve-provision-location\") pod \"6a612862-dbb6-4eaf-bf08-ee50f2f2df52\" (UID: \"6a612862-dbb6-4eaf-bf08-ee50f2f2df52\") "
Apr 25 00:18:28.512733 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.512681 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn7pv\" (UniqueName: \"kubernetes.io/projected/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-kube-api-access-pn7pv\") pod \"6a612862-dbb6-4eaf-bf08-ee50f2f2df52\" (UID: \"6a612862-dbb6-4eaf-bf08-ee50f2f2df52\") "
Apr 25 00:18:28.512733 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.512729 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"6a612862-dbb6-4eaf-bf08-ee50f2f2df52\" (UID: \"6a612862-dbb6-4eaf-bf08-ee50f2f2df52\") "
Apr 25 00:18:28.513034 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.512968 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6a612862-dbb6-4eaf-bf08-ee50f2f2df52" (UID: "6a612862-dbb6-4eaf-bf08-ee50f2f2df52"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:18:28.513184 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.513153 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-isvc-sklearn-mcp-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-mcp-kube-rbac-proxy-sar-config") pod "6a612862-dbb6-4eaf-bf08-ee50f2f2df52" (UID: "6a612862-dbb6-4eaf-bf08-ee50f2f2df52"). InnerVolumeSpecName "isvc-sklearn-mcp-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:18:28.515332 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.515214 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-kube-api-access-pn7pv" (OuterVolumeSpecName: "kube-api-access-pn7pv") pod "6a612862-dbb6-4eaf-bf08-ee50f2f2df52" (UID: "6a612862-dbb6-4eaf-bf08-ee50f2f2df52"). InnerVolumeSpecName "kube-api-access-pn7pv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:18:28.515332 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.515272 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6a612862-dbb6-4eaf-bf08-ee50f2f2df52" (UID: "6a612862-dbb6-4eaf-bf08-ee50f2f2df52"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:18:28.613453 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.613376 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pn7pv\" (UniqueName: \"kubernetes.io/projected/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-kube-api-access-pn7pv\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:18:28.613453 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.613402 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:18:28.613453 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.613412 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:18:28.613453 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.613421 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a612862-dbb6-4eaf-bf08-ee50f2f2df52-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:18:28.622758 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.622728 2576 generic.go:358] "Generic (PLEG): container finished" podID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerID="963878639b28ab5237aabcb742a7dc507c0fb0b792514f2198af2913fce02e12" exitCode=137
Apr 25 00:18:28.622942 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.622828 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" event={"ID":"6a612862-dbb6-4eaf-bf08-ee50f2f2df52","Type":"ContainerDied","Data":"963878639b28ab5237aabcb742a7dc507c0fb0b792514f2198af2913fce02e12"}
Apr 25 00:18:28.622942 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.622848 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng"
Apr 25 00:18:28.622942 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.622857 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng" event={"ID":"6a612862-dbb6-4eaf-bf08-ee50f2f2df52","Type":"ContainerDied","Data":"27e61bacb963f2f6f64ee23444423efc5c9d031027676568a76fc3e84b985905"}
Apr 25 00:18:28.622942 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.622876 2576 scope.go:117] "RemoveContainer" containerID="c9b525939fabc16a449831eaa3e891e799695388a215091b75f76cc907adc51f"
Apr 25 00:18:28.631063 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.631042 2576 scope.go:117] "RemoveContainer" containerID="963878639b28ab5237aabcb742a7dc507c0fb0b792514f2198af2913fce02e12"
Apr 25 00:18:28.637816 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.637795 2576 scope.go:117] "RemoveContainer" containerID="5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a"
Apr 25 00:18:28.645099 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.645076 2576 scope.go:117] "RemoveContainer" containerID="811f7b02c1cd22d7b08ea50ce1ae6a3699e37fe21c99556eb63b0912d46c20dc"
Apr 25 00:18:28.646213 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.646195 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng"]
Apr 25 00:18:28.651990 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.651970 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6c44fd9c54-vg7ng"]
Apr 25 00:18:28.652215 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.652197 2576 scope.go:117] "RemoveContainer" containerID="c9b525939fabc16a449831eaa3e891e799695388a215091b75f76cc907adc51f"
Apr 25 00:18:28.652494 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:18:28.652476 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9b525939fabc16a449831eaa3e891e799695388a215091b75f76cc907adc51f\": container with ID starting with c9b525939fabc16a449831eaa3e891e799695388a215091b75f76cc907adc51f not found: ID does not exist" containerID="c9b525939fabc16a449831eaa3e891e799695388a215091b75f76cc907adc51f"
Apr 25 00:18:28.652544 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.652500 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9b525939fabc16a449831eaa3e891e799695388a215091b75f76cc907adc51f"} err="failed to get container status \"c9b525939fabc16a449831eaa3e891e799695388a215091b75f76cc907adc51f\": rpc error: code = NotFound desc = could not find container \"c9b525939fabc16a449831eaa3e891e799695388a215091b75f76cc907adc51f\": container with ID starting with c9b525939fabc16a449831eaa3e891e799695388a215091b75f76cc907adc51f not found: ID does not exist"
Apr 25 00:18:28.652544 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.652518 2576 scope.go:117] "RemoveContainer" containerID="963878639b28ab5237aabcb742a7dc507c0fb0b792514f2198af2913fce02e12"
Apr 25 00:18:28.652772 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:18:28.652754 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"963878639b28ab5237aabcb742a7dc507c0fb0b792514f2198af2913fce02e12\": container with ID starting with 963878639b28ab5237aabcb742a7dc507c0fb0b792514f2198af2913fce02e12 not found: ID does not exist" containerID="963878639b28ab5237aabcb742a7dc507c0fb0b792514f2198af2913fce02e12"
Apr 25 00:18:28.652820 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.652778 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963878639b28ab5237aabcb742a7dc507c0fb0b792514f2198af2913fce02e12"} err="failed to get container status \"963878639b28ab5237aabcb742a7dc507c0fb0b792514f2198af2913fce02e12\": rpc error: code = NotFound desc = could not find container \"963878639b28ab5237aabcb742a7dc507c0fb0b792514f2198af2913fce02e12\": container with ID starting with 963878639b28ab5237aabcb742a7dc507c0fb0b792514f2198af2913fce02e12 not found: ID does not exist"
Apr 25 00:18:28.652820 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.652794 2576 scope.go:117] "RemoveContainer" containerID="5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a"
Apr 25 00:18:28.653100 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:18:28.653083 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a\": container with ID starting with 5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a not found: ID does not exist" containerID="5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a"
Apr 25 00:18:28.653156 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.653114 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a"} err="failed to get container status \"5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a\": rpc error: code = NotFound desc = could not find container \"5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a\": container with ID starting with 5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a not found: ID does not exist"
Apr 25 00:18:28.653156 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.653129 2576 scope.go:117] "RemoveContainer" containerID="811f7b02c1cd22d7b08ea50ce1ae6a3699e37fe21c99556eb63b0912d46c20dc"
Apr 25 00:18:28.653346 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:18:28.653330 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"811f7b02c1cd22d7b08ea50ce1ae6a3699e37fe21c99556eb63b0912d46c20dc\": container with ID starting with 811f7b02c1cd22d7b08ea50ce1ae6a3699e37fe21c99556eb63b0912d46c20dc not found: ID does not exist" containerID="811f7b02c1cd22d7b08ea50ce1ae6a3699e37fe21c99556eb63b0912d46c20dc"
Apr 25 00:18:28.653390 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:28.653351 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"811f7b02c1cd22d7b08ea50ce1ae6a3699e37fe21c99556eb63b0912d46c20dc"} err="failed to get container status \"811f7b02c1cd22d7b08ea50ce1ae6a3699e37fe21c99556eb63b0912d46c20dc\": rpc error: code = NotFound desc = could not find container \"811f7b02c1cd22d7b08ea50ce1ae6a3699e37fe21c99556eb63b0912d46c20dc\": container with ID starting with 811f7b02c1cd22d7b08ea50ce1ae6a3699e37fe21c99556eb63b0912d46c20dc not found: ID does not exist"
Apr 25 00:18:30.319360 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:30.316111 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" path="/var/lib/kubelet/pods/6a612862-dbb6-4eaf-bf08-ee50f2f2df52/volumes"
Apr 25 00:18:30.768097 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:18:30.768074 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a612862_dbb6_4eaf_bf08_ee50f2f2df52.slice/crio-conmon-5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a.scope\": RecentStats: unable to find data in memory cache]"
Apr 25 00:18:31.592238 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:31.592200 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" podUID="c6ac957c-316d-418b-a10b-fc8abdba63e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 25 00:18:33.280700 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:18:33.280671 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a612862_dbb6_4eaf_bf08_ee50f2f2df52.slice/crio-conmon-5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a.scope\": RecentStats: unable to find data in memory cache]"
Apr 25 00:18:33.409525 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:18:33.409493 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a612862_dbb6_4eaf_bf08_ee50f2f2df52.slice/crio-conmon-5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a.scope\": RecentStats: unable to find data in memory cache]"
Apr 25 00:18:33.410009 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:18:33.409986 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a612862_dbb6_4eaf_bf08_ee50f2f2df52.slice/crio-conmon-5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a.scope\": RecentStats: unable to find data in memory cache]"
Apr 25 00:18:41.591807 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:41.591762 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" podUID="c6ac957c-316d-418b-a10b-fc8abdba63e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 25 00:18:43.314797 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:18:43.314760 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a612862_dbb6_4eaf_bf08_ee50f2f2df52.slice/crio-conmon-5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a.scope\": RecentStats: unable to find data in memory cache]"
Apr 25 00:18:45.734549 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:18:45.734511 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a612862_dbb6_4eaf_bf08_ee50f2f2df52.slice/crio-conmon-5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a.scope\": RecentStats: unable to find data in memory cache]"
Apr 25 00:18:51.592169 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:18:51.592114 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" podUID="c6ac957c-316d-418b-a10b-fc8abdba63e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 25 00:18:53.353010 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:18:53.352976 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a612862_dbb6_4eaf_bf08_ee50f2f2df52.slice/crio-conmon-5b6106ca2503d9c0428150361f20f7844f2dd1c84399283bbb334a077073fd1a.scope\": RecentStats: unable to find data in memory cache]"
Apr 25 00:19:01.592812 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:01.592773 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8"
Apr 25 00:19:09.824759 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:09.824672 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8"]
Apr 25 00:19:09.825139 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:09.825097 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" podUID="c6ac957c-316d-418b-a10b-fc8abdba63e0" containerName="kserve-container" containerID="cri-o://922e9498658eeee2fdaf1675fef47a71c8a887d2151c5fade6271ef08c845fc7" gracePeriod=30
Apr 25 00:19:09.825201 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:09.825137 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" podUID="c6ac957c-316d-418b-a10b-fc8abdba63e0" containerName="kube-rbac-proxy" containerID="cri-o://9e25d242bfad6703ebe0d6a7cc32ca47274ed06d0d22a9c65257dc2cbd519ac5" gracePeriod=30
Apr 25 00:19:09.913676 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:09.913640 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f"]
Apr 25 00:19:09.913961 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:09.913936 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerName="kserve-container"
Apr 25 00:19:09.913961 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:09.913956 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerName="kserve-container"
Apr 25 00:19:09.914158 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:09.913968 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerName="storage-initializer"
Apr 25 00:19:09.914158 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:09.913974 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerName="storage-initializer"
Apr 25 00:19:09.914158 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:09.913992 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerName="kube-rbac-proxy"
Apr 25 00:19:09.914158 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:09.914000 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerName="kube-rbac-proxy"
Apr 25 00:19:09.914158 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:09.914010 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerName="kserve-agent"
Apr 25 00:19:09.914158 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:09.914015 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerName="kserve-agent"
Apr 25 00:19:09.914158 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:09.914065 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerName="kserve-container"
Apr 25 00:19:09.914158 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:09.914077 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerName="kserve-agent"
Apr 25 00:19:09.914158 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:09.914092 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a612862-dbb6-4eaf-bf08-ee50f2f2df52" containerName="kube-rbac-proxy"
Apr 25 00:19:09.916966 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:09.916948 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f"
Apr 25 00:19:09.919320 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:09.919303 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-predictor-serving-cert\""
Apr 25 00:19:09.919404 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:09.919307 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-kube-rbac-proxy-sar-config\""
Apr 25 00:19:09.925542 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:09.925519 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f"]
Apr 25 00:19:10.016322 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:10.016285 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/155b7ec0-7efe-479e-96a9-2d228ba80c2f-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f\" (UID: \"155b7ec0-7efe-479e-96a9-2d228ba80c2f\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f"
Apr 25 00:19:10.016322 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:10.016322 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcqgl\" (UniqueName: \"kubernetes.io/projected/155b7ec0-7efe-479e-96a9-2d228ba80c2f-kube-api-access-tcqgl\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f\" (UID: \"155b7ec0-7efe-479e-96a9-2d228ba80c2f\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f"
Apr 25 00:19:10.016520 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:10.016344 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/155b7ec0-7efe-479e-96a9-2d228ba80c2f-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f\" (UID: \"155b7ec0-7efe-479e-96a9-2d228ba80c2f\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f"
Apr 25 00:19:10.016520 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:10.016366 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/155b7ec0-7efe-479e-96a9-2d228ba80c2f-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f\" (UID: \"155b7ec0-7efe-479e-96a9-2d228ba80c2f\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f"
Apr 25 00:19:10.117084 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:10.117000 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/155b7ec0-7efe-479e-96a9-2d228ba80c2f-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f\" (UID: \"155b7ec0-7efe-479e-96a9-2d228ba80c2f\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f"
Apr 25 00:19:10.117084 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:10.117034 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcqgl\" (UniqueName: \"kubernetes.io/projected/155b7ec0-7efe-479e-96a9-2d228ba80c2f-kube-api-access-tcqgl\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f\" (UID: \"155b7ec0-7efe-479e-96a9-2d228ba80c2f\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f"
Apr 25 00:19:10.117084 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:10.117060 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/155b7ec0-7efe-479e-96a9-2d228ba80c2f-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f\" (UID: \"155b7ec0-7efe-479e-96a9-2d228ba80c2f\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f"
Apr 25 00:19:10.117084 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:10.117079 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/155b7ec0-7efe-479e-96a9-2d228ba80c2f-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f\" (UID: \"155b7ec0-7efe-479e-96a9-2d228ba80c2f\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f"
Apr 25 00:19:10.117475 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:10.117446 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/155b7ec0-7efe-479e-96a9-2d228ba80c2f-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f\" (UID: \"155b7ec0-7efe-479e-96a9-2d228ba80c2f\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f"
Apr 25 00:19:10.117772 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:10.117748 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/155b7ec0-7efe-479e-96a9-2d228ba80c2f-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f\" (UID: \"155b7ec0-7efe-479e-96a9-2d228ba80c2f\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f"
Apr 25 00:19:10.119564 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:10.119546 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/155b7ec0-7efe-479e-96a9-2d228ba80c2f-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f\" (UID: \"155b7ec0-7efe-479e-96a9-2d228ba80c2f\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f"
Apr 25 00:19:10.124704 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:10.124686 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcqgl\" (UniqueName: \"kubernetes.io/projected/155b7ec0-7efe-479e-96a9-2d228ba80c2f-kube-api-access-tcqgl\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f\" (UID: \"155b7ec0-7efe-479e-96a9-2d228ba80c2f\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f"
Apr 25 00:19:10.227165 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:10.227132 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f"
Apr 25 00:19:10.351023 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:10.351000 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f"]
Apr 25 00:19:10.353434 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:19:10.353403 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod155b7ec0_7efe_479e_96a9_2d228ba80c2f.slice/crio-21eb338f921c48802c7f6b49b612182ff4698a2b4af5f86ed22c85aa1a44710d WatchSource:0}: Error finding container 21eb338f921c48802c7f6b49b612182ff4698a2b4af5f86ed22c85aa1a44710d: Status 404 returned error can't find the container with id 21eb338f921c48802c7f6b49b612182ff4698a2b4af5f86ed22c85aa1a44710d
Apr 25 00:19:10.743763 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:10.743666 2576 generic.go:358] "Generic (PLEG): container finished" podID="c6ac957c-316d-418b-a10b-fc8abdba63e0" containerID="9e25d242bfad6703ebe0d6a7cc32ca47274ed06d0d22a9c65257dc2cbd519ac5" exitCode=2
Apr 25 00:19:10.743763 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:10.743735 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" event={"ID":"c6ac957c-316d-418b-a10b-fc8abdba63e0","Type":"ContainerDied","Data":"9e25d242bfad6703ebe0d6a7cc32ca47274ed06d0d22a9c65257dc2cbd519ac5"}
Apr 25 00:19:10.745098 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:10.745065 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" event={"ID":"155b7ec0-7efe-479e-96a9-2d228ba80c2f","Type":"ContainerStarted","Data":"6dec1efe1858d68d2561223db5c00c68628a5ca347d961190455d6a00f158c69"}
Apr 25 00:19:10.745098 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:10.745088 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" event={"ID":"155b7ec0-7efe-479e-96a9-2d228ba80c2f","Type":"ContainerStarted","Data":"21eb338f921c48802c7f6b49b612182ff4698a2b4af5f86ed22c85aa1a44710d"}
Apr 25 00:19:11.586624 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:11.586582 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" podUID="c6ac957c-316d-418b-a10b-fc8abdba63e0" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.32:8643/healthz\": dial tcp 10.134.0.32:8643: connect: connection refused"
Apr 25 00:19:11.592436 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:11.592397 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" podUID="c6ac957c-316d-418b-a10b-fc8abdba63e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused"
Apr 25 00:19:12.363633 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.363612 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8"
Apr 25 00:19:12.434545 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.434518 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c6ac957c-316d-418b-a10b-fc8abdba63e0-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"c6ac957c-316d-418b-a10b-fc8abdba63e0\" (UID: \"c6ac957c-316d-418b-a10b-fc8abdba63e0\") "
Apr 25 00:19:12.434662 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.434555 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6ac957c-316d-418b-a10b-fc8abdba63e0-proxy-tls\") pod \"c6ac957c-316d-418b-a10b-fc8abdba63e0\" (UID: \"c6ac957c-316d-418b-a10b-fc8abdba63e0\") "
Apr 25 00:19:12.434662 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.434601 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6ac957c-316d-418b-a10b-fc8abdba63e0-kserve-provision-location\") pod \"c6ac957c-316d-418b-a10b-fc8abdba63e0\" (UID: \"c6ac957c-316d-418b-a10b-fc8abdba63e0\") "
Apr 25 00:19:12.434662 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.434637 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2jrh\" (UniqueName: \"kubernetes.io/projected/c6ac957c-316d-418b-a10b-fc8abdba63e0-kube-api-access-l2jrh\") pod \"c6ac957c-316d-418b-a10b-fc8abdba63e0\" (UID: \"c6ac957c-316d-418b-a10b-fc8abdba63e0\") "
Apr 25 00:19:12.434882 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.434858 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6ac957c-316d-418b-a10b-fc8abdba63e0-isvc-paddle-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-kube-rbac-proxy-sar-config") pod "c6ac957c-316d-418b-a10b-fc8abdba63e0" (UID: "c6ac957c-316d-418b-a10b-fc8abdba63e0"). InnerVolumeSpecName "isvc-paddle-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:19:12.436849 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.436793 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6ac957c-316d-418b-a10b-fc8abdba63e0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c6ac957c-316d-418b-a10b-fc8abdba63e0" (UID: "c6ac957c-316d-418b-a10b-fc8abdba63e0"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:19:12.436849 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.436795 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6ac957c-316d-418b-a10b-fc8abdba63e0-kube-api-access-l2jrh" (OuterVolumeSpecName: "kube-api-access-l2jrh") pod "c6ac957c-316d-418b-a10b-fc8abdba63e0" (UID: "c6ac957c-316d-418b-a10b-fc8abdba63e0"). InnerVolumeSpecName "kube-api-access-l2jrh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:19:12.449220 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.449196 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6ac957c-316d-418b-a10b-fc8abdba63e0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c6ac957c-316d-418b-a10b-fc8abdba63e0" (UID: "c6ac957c-316d-418b-a10b-fc8abdba63e0"). InnerVolumeSpecName "kserve-provision-location".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:19:12.535881 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.535845 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6ac957c-316d-418b-a10b-fc8abdba63e0-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:19:12.535881 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.535878 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l2jrh\" (UniqueName: \"kubernetes.io/projected/c6ac957c-316d-418b-a10b-fc8abdba63e0-kube-api-access-l2jrh\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:19:12.535881 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.535890 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c6ac957c-316d-418b-a10b-fc8abdba63e0-isvc-paddle-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:19:12.536072 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.535900 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6ac957c-316d-418b-a10b-fc8abdba63e0-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:19:12.752057 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.751972 2576 generic.go:358] "Generic (PLEG): container finished" podID="c6ac957c-316d-418b-a10b-fc8abdba63e0" containerID="922e9498658eeee2fdaf1675fef47a71c8a887d2151c5fade6271ef08c845fc7" exitCode=0 Apr 25 00:19:12.752403 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.752055 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" Apr 25 00:19:12.752403 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.752055 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" event={"ID":"c6ac957c-316d-418b-a10b-fc8abdba63e0","Type":"ContainerDied","Data":"922e9498658eeee2fdaf1675fef47a71c8a887d2151c5fade6271ef08c845fc7"} Apr 25 00:19:12.752403 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.752158 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8" event={"ID":"c6ac957c-316d-418b-a10b-fc8abdba63e0","Type":"ContainerDied","Data":"d38724075d809129586953baec4ff9ad2c21e75989f13cb70a0a36b1a4eb6596"} Apr 25 00:19:12.752403 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.752175 2576 scope.go:117] "RemoveContainer" containerID="9e25d242bfad6703ebe0d6a7cc32ca47274ed06d0d22a9c65257dc2cbd519ac5" Apr 25 00:19:12.760134 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.760116 2576 scope.go:117] "RemoveContainer" containerID="922e9498658eeee2fdaf1675fef47a71c8a887d2151c5fade6271ef08c845fc7" Apr 25 00:19:12.767040 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.767026 2576 scope.go:117] "RemoveContainer" containerID="e6777a14ada05d7cd232d30964e63f773849ec29d714f1c7b19fe34c79913ca6" Apr 25 00:19:12.772189 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.772169 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8"] Apr 25 00:19:12.774040 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.774024 2576 scope.go:117] "RemoveContainer" containerID="9e25d242bfad6703ebe0d6a7cc32ca47274ed06d0d22a9c65257dc2cbd519ac5" Apr 25 00:19:12.774302 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:19:12.774283 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9e25d242bfad6703ebe0d6a7cc32ca47274ed06d0d22a9c65257dc2cbd519ac5\": container with ID starting with 9e25d242bfad6703ebe0d6a7cc32ca47274ed06d0d22a9c65257dc2cbd519ac5 not found: ID does not exist" containerID="9e25d242bfad6703ebe0d6a7cc32ca47274ed06d0d22a9c65257dc2cbd519ac5" Apr 25 00:19:12.774339 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.774310 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e25d242bfad6703ebe0d6a7cc32ca47274ed06d0d22a9c65257dc2cbd519ac5"} err="failed to get container status \"9e25d242bfad6703ebe0d6a7cc32ca47274ed06d0d22a9c65257dc2cbd519ac5\": rpc error: code = NotFound desc = could not find container \"9e25d242bfad6703ebe0d6a7cc32ca47274ed06d0d22a9c65257dc2cbd519ac5\": container with ID starting with 9e25d242bfad6703ebe0d6a7cc32ca47274ed06d0d22a9c65257dc2cbd519ac5 not found: ID does not exist" Apr 25 00:19:12.774339 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.774328 2576 scope.go:117] "RemoveContainer" containerID="922e9498658eeee2fdaf1675fef47a71c8a887d2151c5fade6271ef08c845fc7" Apr 25 00:19:12.774578 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:19:12.774558 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"922e9498658eeee2fdaf1675fef47a71c8a887d2151c5fade6271ef08c845fc7\": container with ID starting with 922e9498658eeee2fdaf1675fef47a71c8a887d2151c5fade6271ef08c845fc7 not found: ID does not exist" containerID="922e9498658eeee2fdaf1675fef47a71c8a887d2151c5fade6271ef08c845fc7" Apr 25 00:19:12.774634 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.774585 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"922e9498658eeee2fdaf1675fef47a71c8a887d2151c5fade6271ef08c845fc7"} err="failed to get container status \"922e9498658eeee2fdaf1675fef47a71c8a887d2151c5fade6271ef08c845fc7\": rpc error: code = NotFound desc = could not find container 
\"922e9498658eeee2fdaf1675fef47a71c8a887d2151c5fade6271ef08c845fc7\": container with ID starting with 922e9498658eeee2fdaf1675fef47a71c8a887d2151c5fade6271ef08c845fc7 not found: ID does not exist" Apr 25 00:19:12.774634 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.774603 2576 scope.go:117] "RemoveContainer" containerID="e6777a14ada05d7cd232d30964e63f773849ec29d714f1c7b19fe34c79913ca6" Apr 25 00:19:12.774829 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:19:12.774813 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6777a14ada05d7cd232d30964e63f773849ec29d714f1c7b19fe34c79913ca6\": container with ID starting with e6777a14ada05d7cd232d30964e63f773849ec29d714f1c7b19fe34c79913ca6 not found: ID does not exist" containerID="e6777a14ada05d7cd232d30964e63f773849ec29d714f1c7b19fe34c79913ca6" Apr 25 00:19:12.774868 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.774833 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6777a14ada05d7cd232d30964e63f773849ec29d714f1c7b19fe34c79913ca6"} err="failed to get container status \"e6777a14ada05d7cd232d30964e63f773849ec29d714f1c7b19fe34c79913ca6\": rpc error: code = NotFound desc = could not find container \"e6777a14ada05d7cd232d30964e63f773849ec29d714f1c7b19fe34c79913ca6\": container with ID starting with e6777a14ada05d7cd232d30964e63f773849ec29d714f1c7b19fe34c79913ca6 not found: ID does not exist" Apr 25 00:19:12.778906 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:12.778887 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-776w8"] Apr 25 00:19:14.315637 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:14.315601 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6ac957c-316d-418b-a10b-fc8abdba63e0" path="/var/lib/kubelet/pods/c6ac957c-316d-418b-a10b-fc8abdba63e0/volumes" Apr 25 00:19:15.762930 
ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:15.762892 2576 generic.go:358] "Generic (PLEG): container finished" podID="155b7ec0-7efe-479e-96a9-2d228ba80c2f" containerID="6dec1efe1858d68d2561223db5c00c68628a5ca347d961190455d6a00f158c69" exitCode=0 Apr 25 00:19:15.763291 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:15.762983 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" event={"ID":"155b7ec0-7efe-479e-96a9-2d228ba80c2f","Type":"ContainerDied","Data":"6dec1efe1858d68d2561223db5c00c68628a5ca347d961190455d6a00f158c69"} Apr 25 00:19:16.767352 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:16.767319 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" event={"ID":"155b7ec0-7efe-479e-96a9-2d228ba80c2f","Type":"ContainerStarted","Data":"2b5eb7562fc6adae2b3a07a2925eabeb786d54c40500cd3f8ffabda165555df9"} Apr 25 00:19:16.767352 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:16.767357 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" event={"ID":"155b7ec0-7efe-479e-96a9-2d228ba80c2f","Type":"ContainerStarted","Data":"d85365f6165c4caa4b44d92f56462d3456d361809bbe250faf9264d27ca55ca8"} Apr 25 00:19:16.767753 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:16.767604 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" Apr 25 00:19:16.786390 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:16.786327 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" podStartSLOduration=7.786309341 podStartE2EDuration="7.786309341s" podCreationTimestamp="2026-04-25 00:19:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:19:16.784031264 +0000 UTC m=+1517.065148186" watchObservedRunningTime="2026-04-25 00:19:16.786309341 +0000 UTC m=+1517.067426262" Apr 25 00:19:17.769926 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:17.769896 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" Apr 25 00:19:17.770850 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:17.770824 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" podUID="155b7ec0-7efe-479e-96a9-2d228ba80c2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 25 00:19:18.773046 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:18.773006 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" podUID="155b7ec0-7efe-479e-96a9-2d228ba80c2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 25 00:19:23.777698 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:23.777670 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" Apr 25 00:19:23.778327 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:23.778302 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" podUID="155b7ec0-7efe-479e-96a9-2d228ba80c2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 25 00:19:33.778996 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:33.778947 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" 
podUID="155b7ec0-7efe-479e-96a9-2d228ba80c2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 25 00:19:37.049556 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:37.049527 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 25 00:19:37.050423 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:37.050399 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 25 00:19:43.779146 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:43.779098 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" podUID="155b7ec0-7efe-479e-96a9-2d228ba80c2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 25 00:19:53.778950 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:19:53.778893 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" podUID="155b7ec0-7efe-479e-96a9-2d228ba80c2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 25 00:20:03.779003 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:03.778976 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" Apr 25 00:20:11.333937 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.333896 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f"] Apr 25 00:20:11.334576 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.334223 2576 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" podUID="155b7ec0-7efe-479e-96a9-2d228ba80c2f" containerName="kserve-container" containerID="cri-o://d85365f6165c4caa4b44d92f56462d3456d361809bbe250faf9264d27ca55ca8" gracePeriod=30 Apr 25 00:20:11.334576 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.334273 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" podUID="155b7ec0-7efe-479e-96a9-2d228ba80c2f" containerName="kube-rbac-proxy" containerID="cri-o://2b5eb7562fc6adae2b3a07a2925eabeb786d54c40500cd3f8ffabda165555df9" gracePeriod=30 Apr 25 00:20:11.422264 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.422233 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw"] Apr 25 00:20:11.422514 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.422503 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6ac957c-316d-418b-a10b-fc8abdba63e0" containerName="storage-initializer" Apr 25 00:20:11.422556 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.422516 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ac957c-316d-418b-a10b-fc8abdba63e0" containerName="storage-initializer" Apr 25 00:20:11.422556 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.422526 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6ac957c-316d-418b-a10b-fc8abdba63e0" containerName="kserve-container" Apr 25 00:20:11.422556 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.422531 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ac957c-316d-418b-a10b-fc8abdba63e0" containerName="kserve-container" Apr 25 00:20:11.422556 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.422538 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6ac957c-316d-418b-a10b-fc8abdba63e0" 
containerName="kube-rbac-proxy" Apr 25 00:20:11.422556 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.422544 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ac957c-316d-418b-a10b-fc8abdba63e0" containerName="kube-rbac-proxy" Apr 25 00:20:11.422727 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.422590 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c6ac957c-316d-418b-a10b-fc8abdba63e0" containerName="kube-rbac-proxy" Apr 25 00:20:11.422727 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.422603 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c6ac957c-316d-418b-a10b-fc8abdba63e0" containerName="kserve-container" Apr 25 00:20:11.426762 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.426740 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" Apr 25 00:20:11.429167 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.429148 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-predictor-serving-cert\"" Apr 25 00:20:11.429291 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.429171 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 25 00:20:11.436058 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.436037 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw"] Apr 25 00:20:11.459441 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.459420 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/498109a8-1613-42a0-8e65-4cc5a73988b0-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw\" (UID: 
\"498109a8-1613-42a0-8e65-4cc5a73988b0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" Apr 25 00:20:11.459541 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.459465 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/498109a8-1613-42a0-8e65-4cc5a73988b0-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw\" (UID: \"498109a8-1613-42a0-8e65-4cc5a73988b0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" Apr 25 00:20:11.459541 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.459487 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68zbf\" (UniqueName: \"kubernetes.io/projected/498109a8-1613-42a0-8e65-4cc5a73988b0-kube-api-access-68zbf\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw\" (UID: \"498109a8-1613-42a0-8e65-4cc5a73988b0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" Apr 25 00:20:11.459541 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.459521 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/498109a8-1613-42a0-8e65-4cc5a73988b0-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw\" (UID: \"498109a8-1613-42a0-8e65-4cc5a73988b0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" Apr 25 00:20:11.560886 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.560839 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/498109a8-1613-42a0-8e65-4cc5a73988b0-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw\" (UID: \"498109a8-1613-42a0-8e65-4cc5a73988b0\") " 
pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" Apr 25 00:20:11.561083 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.560945 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/498109a8-1613-42a0-8e65-4cc5a73988b0-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw\" (UID: \"498109a8-1613-42a0-8e65-4cc5a73988b0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" Apr 25 00:20:11.561083 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.560983 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/498109a8-1613-42a0-8e65-4cc5a73988b0-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw\" (UID: \"498109a8-1613-42a0-8e65-4cc5a73988b0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" Apr 25 00:20:11.561083 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:20:11.561005 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-serving-cert: secret "isvc-paddle-v2-kserve-predictor-serving-cert" not found Apr 25 00:20:11.561224 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:20:11.561088 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/498109a8-1613-42a0-8e65-4cc5a73988b0-proxy-tls podName:498109a8-1613-42a0-8e65-4cc5a73988b0 nodeName:}" failed. No retries permitted until 2026-04-25 00:20:12.061068977 +0000 UTC m=+1572.342185879 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/498109a8-1613-42a0-8e65-4cc5a73988b0-proxy-tls") pod "isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" (UID: "498109a8-1613-42a0-8e65-4cc5a73988b0") : secret "isvc-paddle-v2-kserve-predictor-serving-cert" not found Apr 25 00:20:11.561224 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.561007 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68zbf\" (UniqueName: \"kubernetes.io/projected/498109a8-1613-42a0-8e65-4cc5a73988b0-kube-api-access-68zbf\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw\" (UID: \"498109a8-1613-42a0-8e65-4cc5a73988b0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" Apr 25 00:20:11.561398 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.561378 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/498109a8-1613-42a0-8e65-4cc5a73988b0-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw\" (UID: \"498109a8-1613-42a0-8e65-4cc5a73988b0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" Apr 25 00:20:11.561629 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.561610 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/498109a8-1613-42a0-8e65-4cc5a73988b0-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw\" (UID: \"498109a8-1613-42a0-8e65-4cc5a73988b0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" Apr 25 00:20:11.569544 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.569526 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68zbf\" (UniqueName: 
\"kubernetes.io/projected/498109a8-1613-42a0-8e65-4cc5a73988b0-kube-api-access-68zbf\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw\" (UID: \"498109a8-1613-42a0-8e65-4cc5a73988b0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" Apr 25 00:20:11.922017 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.921981 2576 generic.go:358] "Generic (PLEG): container finished" podID="155b7ec0-7efe-479e-96a9-2d228ba80c2f" containerID="2b5eb7562fc6adae2b3a07a2925eabeb786d54c40500cd3f8ffabda165555df9" exitCode=2 Apr 25 00:20:11.922017 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:11.922021 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" event={"ID":"155b7ec0-7efe-479e-96a9-2d228ba80c2f","Type":"ContainerDied","Data":"2b5eb7562fc6adae2b3a07a2925eabeb786d54c40500cd3f8ffabda165555df9"} Apr 25 00:20:12.065906 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:12.065872 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/498109a8-1613-42a0-8e65-4cc5a73988b0-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw\" (UID: \"498109a8-1613-42a0-8e65-4cc5a73988b0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" Apr 25 00:20:12.068429 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:12.068408 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/498109a8-1613-42a0-8e65-4cc5a73988b0-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw\" (UID: \"498109a8-1613-42a0-8e65-4cc5a73988b0\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" Apr 25 00:20:12.337143 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:12.337111 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" Apr 25 00:20:12.457138 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:12.457114 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw"] Apr 25 00:20:12.459513 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:20:12.459483 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod498109a8_1613_42a0_8e65_4cc5a73988b0.slice/crio-c5f9369836299ee70586be19c26693d6fdcd017b00c78f18d8cde43c1d63498c WatchSource:0}: Error finding container c5f9369836299ee70586be19c26693d6fdcd017b00c78f18d8cde43c1d63498c: Status 404 returned error can't find the container with id c5f9369836299ee70586be19c26693d6fdcd017b00c78f18d8cde43c1d63498c Apr 25 00:20:12.925845 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:12.925802 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" event={"ID":"498109a8-1613-42a0-8e65-4cc5a73988b0","Type":"ContainerStarted","Data":"34f1fbcbfec2229c654dc96625bd7816e4bf5a4d7d3e538cfbf1be6a59ea8fb0"} Apr 25 00:20:12.925845 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:12.925837 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" event={"ID":"498109a8-1613-42a0-8e65-4cc5a73988b0","Type":"ContainerStarted","Data":"c5f9369836299ee70586be19c26693d6fdcd017b00c78f18d8cde43c1d63498c"} Apr 25 00:20:13.773458 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:13.773420 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" podUID="155b7ec0-7efe-479e-96a9-2d228ba80c2f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.33:8643/healthz\": dial tcp 10.134.0.33:8643: connect: connection 
refused" Apr 25 00:20:13.778810 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:13.778787 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" podUID="155b7ec0-7efe-479e-96a9-2d228ba80c2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 25 00:20:13.930689 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:13.930656 2576 generic.go:358] "Generic (PLEG): container finished" podID="155b7ec0-7efe-479e-96a9-2d228ba80c2f" containerID="d85365f6165c4caa4b44d92f56462d3456d361809bbe250faf9264d27ca55ca8" exitCode=0 Apr 25 00:20:13.930814 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:13.930726 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" event={"ID":"155b7ec0-7efe-479e-96a9-2d228ba80c2f","Type":"ContainerDied","Data":"d85365f6165c4caa4b44d92f56462d3456d361809bbe250faf9264d27ca55ca8"} Apr 25 00:20:13.969114 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:13.969095 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" Apr 25 00:20:14.081575 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:14.081496 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/155b7ec0-7efe-479e-96a9-2d228ba80c2f-kserve-provision-location\") pod \"155b7ec0-7efe-479e-96a9-2d228ba80c2f\" (UID: \"155b7ec0-7efe-479e-96a9-2d228ba80c2f\") " Apr 25 00:20:14.081575 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:14.081534 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/155b7ec0-7efe-479e-96a9-2d228ba80c2f-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"155b7ec0-7efe-479e-96a9-2d228ba80c2f\" (UID: \"155b7ec0-7efe-479e-96a9-2d228ba80c2f\") " Apr 25 00:20:14.081768 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:14.081584 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/155b7ec0-7efe-479e-96a9-2d228ba80c2f-proxy-tls\") pod \"155b7ec0-7efe-479e-96a9-2d228ba80c2f\" (UID: \"155b7ec0-7efe-479e-96a9-2d228ba80c2f\") " Apr 25 00:20:14.081768 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:14.081617 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcqgl\" (UniqueName: \"kubernetes.io/projected/155b7ec0-7efe-479e-96a9-2d228ba80c2f-kube-api-access-tcqgl\") pod \"155b7ec0-7efe-479e-96a9-2d228ba80c2f\" (UID: \"155b7ec0-7efe-479e-96a9-2d228ba80c2f\") " Apr 25 00:20:14.081939 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:14.081894 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/155b7ec0-7efe-479e-96a9-2d228ba80c2f-isvc-paddle-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-paddle-runtime-kube-rbac-proxy-sar-config") pod "155b7ec0-7efe-479e-96a9-2d228ba80c2f" (UID: "155b7ec0-7efe-479e-96a9-2d228ba80c2f"). InnerVolumeSpecName "isvc-paddle-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:20:14.083709 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:14.083689 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/155b7ec0-7efe-479e-96a9-2d228ba80c2f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "155b7ec0-7efe-479e-96a9-2d228ba80c2f" (UID: "155b7ec0-7efe-479e-96a9-2d228ba80c2f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:20:14.083928 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:14.083892 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/155b7ec0-7efe-479e-96a9-2d228ba80c2f-kube-api-access-tcqgl" (OuterVolumeSpecName: "kube-api-access-tcqgl") pod "155b7ec0-7efe-479e-96a9-2d228ba80c2f" (UID: "155b7ec0-7efe-479e-96a9-2d228ba80c2f"). InnerVolumeSpecName "kube-api-access-tcqgl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:20:14.091546 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:14.091524 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/155b7ec0-7efe-479e-96a9-2d228ba80c2f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "155b7ec0-7efe-479e-96a9-2d228ba80c2f" (UID: "155b7ec0-7efe-479e-96a9-2d228ba80c2f"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:20:14.182544 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:14.182521 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/155b7ec0-7efe-479e-96a9-2d228ba80c2f-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:20:14.182634 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:14.182548 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/155b7ec0-7efe-479e-96a9-2d228ba80c2f-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:20:14.182634 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:14.182559 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/155b7ec0-7efe-479e-96a9-2d228ba80c2f-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:20:14.182634 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:14.182568 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tcqgl\" (UniqueName: \"kubernetes.io/projected/155b7ec0-7efe-479e-96a9-2d228ba80c2f-kube-api-access-tcqgl\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:20:14.935171 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:14.935141 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" event={"ID":"155b7ec0-7efe-479e-96a9-2d228ba80c2f","Type":"ContainerDied","Data":"21eb338f921c48802c7f6b49b612182ff4698a2b4af5f86ed22c85aa1a44710d"} Apr 25 00:20:14.935623 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:14.935180 2576 scope.go:117] "RemoveContainer" containerID="2b5eb7562fc6adae2b3a07a2925eabeb786d54c40500cd3f8ffabda165555df9" Apr 25 00:20:14.935623 ip-10-0-129-109 
kubenswrapper[2576]: I0425 00:20:14.935192 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f" Apr 25 00:20:14.943135 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:14.943116 2576 scope.go:117] "RemoveContainer" containerID="d85365f6165c4caa4b44d92f56462d3456d361809bbe250faf9264d27ca55ca8" Apr 25 00:20:14.949977 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:14.949949 2576 scope.go:117] "RemoveContainer" containerID="6dec1efe1858d68d2561223db5c00c68628a5ca347d961190455d6a00f158c69" Apr 25 00:20:14.952729 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:14.952706 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f"] Apr 25 00:20:14.957033 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:14.957014 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-lrr2f"] Apr 25 00:20:16.315935 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:16.315892 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="155b7ec0-7efe-479e-96a9-2d228ba80c2f" path="/var/lib/kubelet/pods/155b7ec0-7efe-479e-96a9-2d228ba80c2f/volumes" Apr 25 00:20:16.943319 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:16.943287 2576 generic.go:358] "Generic (PLEG): container finished" podID="498109a8-1613-42a0-8e65-4cc5a73988b0" containerID="34f1fbcbfec2229c654dc96625bd7816e4bf5a4d7d3e538cfbf1be6a59ea8fb0" exitCode=0 Apr 25 00:20:16.943475 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:16.943337 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" event={"ID":"498109a8-1613-42a0-8e65-4cc5a73988b0","Type":"ContainerDied","Data":"34f1fbcbfec2229c654dc96625bd7816e4bf5a4d7d3e538cfbf1be6a59ea8fb0"} Apr 25 00:20:17.948065 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:17.948029 
2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" event={"ID":"498109a8-1613-42a0-8e65-4cc5a73988b0","Type":"ContainerStarted","Data":"da02fcd2beeddada82a86e6a5cf014b730036bc9f1feee395c073c70c56cdd0e"} Apr 25 00:20:17.948065 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:17.948069 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" event={"ID":"498109a8-1613-42a0-8e65-4cc5a73988b0","Type":"ContainerStarted","Data":"7db96fdfe87789aaa7352a18b248887e4559abf54e45e93c9fc59bedb0f73d87"} Apr 25 00:20:17.948440 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:17.948342 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" Apr 25 00:20:17.948500 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:17.948475 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" Apr 25 00:20:17.949681 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:17.949658 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" podUID="498109a8-1613-42a0-8e65-4cc5a73988b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 25 00:20:17.966495 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:17.966426 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" podStartSLOduration=6.966413431 podStartE2EDuration="6.966413431s" podCreationTimestamp="2026-04-25 00:20:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:20:17.965121768 +0000 UTC 
m=+1578.246238692" watchObservedRunningTime="2026-04-25 00:20:17.966413431 +0000 UTC m=+1578.247530351" Apr 25 00:20:18.950693 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:18.950652 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" podUID="498109a8-1613-42a0-8e65-4cc5a73988b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 25 00:20:23.955256 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:23.955224 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" Apr 25 00:20:23.955722 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:23.955696 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" podUID="498109a8-1613-42a0-8e65-4cc5a73988b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 25 00:20:33.955735 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:33.955699 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" podUID="498109a8-1613-42a0-8e65-4cc5a73988b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 25 00:20:43.956174 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:43.956134 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" podUID="498109a8-1613-42a0-8e65-4cc5a73988b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 25 00:20:53.955957 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:20:53.955898 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" podUID="498109a8-1613-42a0-8e65-4cc5a73988b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 25 00:21:03.957050 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:03.957020 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" Apr 25 00:21:13.042004 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.041971 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw"] Apr 25 00:21:13.042374 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.042314 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" podUID="498109a8-1613-42a0-8e65-4cc5a73988b0" containerName="kserve-container" containerID="cri-o://7db96fdfe87789aaa7352a18b248887e4559abf54e45e93c9fc59bedb0f73d87" gracePeriod=30 Apr 25 00:21:13.042437 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.042348 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" podUID="498109a8-1613-42a0-8e65-4cc5a73988b0" containerName="kube-rbac-proxy" containerID="cri-o://da02fcd2beeddada82a86e6a5cf014b730036bc9f1feee395c073c70c56cdd0e" gracePeriod=30 Apr 25 00:21:13.145404 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.145370 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g"] Apr 25 00:21:13.145676 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.145664 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="155b7ec0-7efe-479e-96a9-2d228ba80c2f" containerName="kube-rbac-proxy" Apr 25 00:21:13.145716 ip-10-0-129-109 kubenswrapper[2576]: I0425 
00:21:13.145678 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="155b7ec0-7efe-479e-96a9-2d228ba80c2f" containerName="kube-rbac-proxy" Apr 25 00:21:13.145716 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.145689 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="155b7ec0-7efe-479e-96a9-2d228ba80c2f" containerName="storage-initializer" Apr 25 00:21:13.145716 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.145694 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="155b7ec0-7efe-479e-96a9-2d228ba80c2f" containerName="storage-initializer" Apr 25 00:21:13.145716 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.145702 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="155b7ec0-7efe-479e-96a9-2d228ba80c2f" containerName="kserve-container" Apr 25 00:21:13.145716 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.145710 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="155b7ec0-7efe-479e-96a9-2d228ba80c2f" containerName="kserve-container" Apr 25 00:21:13.145980 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.145772 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="155b7ec0-7efe-479e-96a9-2d228ba80c2f" containerName="kube-rbac-proxy" Apr 25 00:21:13.145980 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.145785 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="155b7ec0-7efe-479e-96a9-2d228ba80c2f" containerName="kserve-container" Apr 25 00:21:13.150295 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.150274 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" Apr 25 00:21:13.152556 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.152536 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-kube-rbac-proxy-sar-config\"" Apr 25 00:21:13.152556 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.152549 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-predictor-serving-cert\"" Apr 25 00:21:13.157363 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.157334 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g"] Apr 25 00:21:13.218544 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.218510 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/db14f507-793c-418f-8098-0cabb29c618d-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-9rk2g\" (UID: \"db14f507-793c-418f-8098-0cabb29c618d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" Apr 25 00:21:13.218656 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.218546 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db14f507-793c-418f-8098-0cabb29c618d-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-9rk2g\" (UID: \"db14f507-793c-418f-8098-0cabb29c618d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" Apr 25 00:21:13.218656 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.218577 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhz2x\" (UniqueName: \"kubernetes.io/projected/db14f507-793c-418f-8098-0cabb29c618d-kube-api-access-nhz2x\") pod 
\"isvc-pmml-predictor-8bb578669-9rk2g\" (UID: \"db14f507-793c-418f-8098-0cabb29c618d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" Apr 25 00:21:13.218731 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.218693 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db14f507-793c-418f-8098-0cabb29c618d-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-9rk2g\" (UID: \"db14f507-793c-418f-8098-0cabb29c618d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" Apr 25 00:21:13.319743 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.319672 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhz2x\" (UniqueName: \"kubernetes.io/projected/db14f507-793c-418f-8098-0cabb29c618d-kube-api-access-nhz2x\") pod \"isvc-pmml-predictor-8bb578669-9rk2g\" (UID: \"db14f507-793c-418f-8098-0cabb29c618d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" Apr 25 00:21:13.319743 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.319726 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db14f507-793c-418f-8098-0cabb29c618d-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-9rk2g\" (UID: \"db14f507-793c-418f-8098-0cabb29c618d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" Apr 25 00:21:13.319883 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.319763 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/db14f507-793c-418f-8098-0cabb29c618d-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-9rk2g\" (UID: \"db14f507-793c-418f-8098-0cabb29c618d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" Apr 
25 00:21:13.319883 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.319794 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db14f507-793c-418f-8098-0cabb29c618d-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-9rk2g\" (UID: \"db14f507-793c-418f-8098-0cabb29c618d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" Apr 25 00:21:13.320004 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:21:13.319905 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-pmml-predictor-serving-cert: secret "isvc-pmml-predictor-serving-cert" not found Apr 25 00:21:13.320004 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:21:13.319993 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db14f507-793c-418f-8098-0cabb29c618d-proxy-tls podName:db14f507-793c-418f-8098-0cabb29c618d nodeName:}" failed. No retries permitted until 2026-04-25 00:21:13.819971203 +0000 UTC m=+1634.101088104 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/db14f507-793c-418f-8098-0cabb29c618d-proxy-tls") pod "isvc-pmml-predictor-8bb578669-9rk2g" (UID: "db14f507-793c-418f-8098-0cabb29c618d") : secret "isvc-pmml-predictor-serving-cert" not found Apr 25 00:21:13.320225 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.320207 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db14f507-793c-418f-8098-0cabb29c618d-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-9rk2g\" (UID: \"db14f507-793c-418f-8098-0cabb29c618d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" Apr 25 00:21:13.320453 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.320435 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/db14f507-793c-418f-8098-0cabb29c618d-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-9rk2g\" (UID: \"db14f507-793c-418f-8098-0cabb29c618d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" Apr 25 00:21:13.328568 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.328544 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhz2x\" (UniqueName: \"kubernetes.io/projected/db14f507-793c-418f-8098-0cabb29c618d-kube-api-access-nhz2x\") pod \"isvc-pmml-predictor-8bb578669-9rk2g\" (UID: \"db14f507-793c-418f-8098-0cabb29c618d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" Apr 25 00:21:13.824003 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.823974 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db14f507-793c-418f-8098-0cabb29c618d-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-9rk2g\" (UID: \"db14f507-793c-418f-8098-0cabb29c618d\") " 
pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" Apr 25 00:21:13.826401 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.826374 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db14f507-793c-418f-8098-0cabb29c618d-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-9rk2g\" (UID: \"db14f507-793c-418f-8098-0cabb29c618d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" Apr 25 00:21:13.951385 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.951347 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" podUID="498109a8-1613-42a0-8e65-4cc5a73988b0" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.34:8643/healthz\": dial tcp 10.134.0.34:8643: connect: connection refused" Apr 25 00:21:13.955634 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:13.955610 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" podUID="498109a8-1613-42a0-8e65-4cc5a73988b0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 25 00:21:14.062438 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:14.062392 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" Apr 25 00:21:14.107256 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:14.107178 2576 generic.go:358] "Generic (PLEG): container finished" podID="498109a8-1613-42a0-8e65-4cc5a73988b0" containerID="da02fcd2beeddada82a86e6a5cf014b730036bc9f1feee395c073c70c56cdd0e" exitCode=2 Apr 25 00:21:14.107395 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:14.107256 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" event={"ID":"498109a8-1613-42a0-8e65-4cc5a73988b0","Type":"ContainerDied","Data":"da02fcd2beeddada82a86e6a5cf014b730036bc9f1feee395c073c70c56cdd0e"} Apr 25 00:21:14.179754 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:14.179623 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g"] Apr 25 00:21:14.181873 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:21:14.181850 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb14f507_793c_418f_8098_0cabb29c618d.slice/crio-b5cb03590a1750052a8c5e4684f1ba0498d7c90e07fefa7ad473bee3fca2ac5a WatchSource:0}: Error finding container b5cb03590a1750052a8c5e4684f1ba0498d7c90e07fefa7ad473bee3fca2ac5a: Status 404 returned error can't find the container with id b5cb03590a1750052a8c5e4684f1ba0498d7c90e07fefa7ad473bee3fca2ac5a Apr 25 00:21:14.183676 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:14.183660 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:21:15.111289 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:15.111250 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" 
event={"ID":"db14f507-793c-418f-8098-0cabb29c618d","Type":"ContainerStarted","Data":"bb95556111d686d303c19a83ff97f10912bfe9cf5bb7ae56b852675f7fca51a8"} Apr 25 00:21:15.111289 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:15.111288 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" event={"ID":"db14f507-793c-418f-8098-0cabb29c618d","Type":"ContainerStarted","Data":"b5cb03590a1750052a8c5e4684f1ba0498d7c90e07fefa7ad473bee3fca2ac5a"} Apr 25 00:21:15.681160 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:15.681135 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" Apr 25 00:21:15.738459 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:15.738427 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/498109a8-1613-42a0-8e65-4cc5a73988b0-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"498109a8-1613-42a0-8e65-4cc5a73988b0\" (UID: \"498109a8-1613-42a0-8e65-4cc5a73988b0\") " Apr 25 00:21:15.738599 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:15.738472 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/498109a8-1613-42a0-8e65-4cc5a73988b0-proxy-tls\") pod \"498109a8-1613-42a0-8e65-4cc5a73988b0\" (UID: \"498109a8-1613-42a0-8e65-4cc5a73988b0\") " Apr 25 00:21:15.738599 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:15.738515 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68zbf\" (UniqueName: \"kubernetes.io/projected/498109a8-1613-42a0-8e65-4cc5a73988b0-kube-api-access-68zbf\") pod \"498109a8-1613-42a0-8e65-4cc5a73988b0\" (UID: \"498109a8-1613-42a0-8e65-4cc5a73988b0\") " Apr 25 00:21:15.738599 ip-10-0-129-109 kubenswrapper[2576]: I0425 
00:21:15.738562 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/498109a8-1613-42a0-8e65-4cc5a73988b0-kserve-provision-location\") pod \"498109a8-1613-42a0-8e65-4cc5a73988b0\" (UID: \"498109a8-1613-42a0-8e65-4cc5a73988b0\") " Apr 25 00:21:15.738879 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:15.738858 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/498109a8-1613-42a0-8e65-4cc5a73988b0-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config") pod "498109a8-1613-42a0-8e65-4cc5a73988b0" (UID: "498109a8-1613-42a0-8e65-4cc5a73988b0"). InnerVolumeSpecName "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:21:15.740745 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:15.740713 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/498109a8-1613-42a0-8e65-4cc5a73988b0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "498109a8-1613-42a0-8e65-4cc5a73988b0" (UID: "498109a8-1613-42a0-8e65-4cc5a73988b0"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:21:15.740842 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:15.740748 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/498109a8-1613-42a0-8e65-4cc5a73988b0-kube-api-access-68zbf" (OuterVolumeSpecName: "kube-api-access-68zbf") pod "498109a8-1613-42a0-8e65-4cc5a73988b0" (UID: "498109a8-1613-42a0-8e65-4cc5a73988b0"). InnerVolumeSpecName "kube-api-access-68zbf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:21:15.748071 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:15.748043 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/498109a8-1613-42a0-8e65-4cc5a73988b0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "498109a8-1613-42a0-8e65-4cc5a73988b0" (UID: "498109a8-1613-42a0-8e65-4cc5a73988b0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:21:15.839796 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:15.839765 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/498109a8-1613-42a0-8e65-4cc5a73988b0-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:21:15.839796 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:15.839797 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-68zbf\" (UniqueName: \"kubernetes.io/projected/498109a8-1613-42a0-8e65-4cc5a73988b0-kube-api-access-68zbf\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:21:15.839796 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:15.839808 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/498109a8-1613-42a0-8e65-4cc5a73988b0-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:21:15.840001 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:15.839818 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/498109a8-1613-42a0-8e65-4cc5a73988b0-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:21:16.115403 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:16.115370 2576 generic.go:358] 
"Generic (PLEG): container finished" podID="498109a8-1613-42a0-8e65-4cc5a73988b0" containerID="7db96fdfe87789aaa7352a18b248887e4559abf54e45e93c9fc59bedb0f73d87" exitCode=0 Apr 25 00:21:16.115780 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:16.115460 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" Apr 25 00:21:16.115780 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:16.115501 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" event={"ID":"498109a8-1613-42a0-8e65-4cc5a73988b0","Type":"ContainerDied","Data":"7db96fdfe87789aaa7352a18b248887e4559abf54e45e93c9fc59bedb0f73d87"} Apr 25 00:21:16.115780 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:16.115540 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw" event={"ID":"498109a8-1613-42a0-8e65-4cc5a73988b0","Type":"ContainerDied","Data":"c5f9369836299ee70586be19c26693d6fdcd017b00c78f18d8cde43c1d63498c"} Apr 25 00:21:16.115780 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:16.115555 2576 scope.go:117] "RemoveContainer" containerID="da02fcd2beeddada82a86e6a5cf014b730036bc9f1feee395c073c70c56cdd0e" Apr 25 00:21:16.124095 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:16.124072 2576 scope.go:117] "RemoveContainer" containerID="7db96fdfe87789aaa7352a18b248887e4559abf54e45e93c9fc59bedb0f73d87" Apr 25 00:21:16.136063 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:16.136046 2576 scope.go:117] "RemoveContainer" containerID="34f1fbcbfec2229c654dc96625bd7816e4bf5a4d7d3e538cfbf1be6a59ea8fb0" Apr 25 00:21:16.140666 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:16.140639 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw"] Apr 25 00:21:16.143079 ip-10-0-129-109 kubenswrapper[2576]: I0425 
00:21:16.143055 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-dxvcw"] Apr 25 00:21:16.144560 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:16.144544 2576 scope.go:117] "RemoveContainer" containerID="da02fcd2beeddada82a86e6a5cf014b730036bc9f1feee395c073c70c56cdd0e" Apr 25 00:21:16.144789 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:21:16.144770 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da02fcd2beeddada82a86e6a5cf014b730036bc9f1feee395c073c70c56cdd0e\": container with ID starting with da02fcd2beeddada82a86e6a5cf014b730036bc9f1feee395c073c70c56cdd0e not found: ID does not exist" containerID="da02fcd2beeddada82a86e6a5cf014b730036bc9f1feee395c073c70c56cdd0e" Apr 25 00:21:16.144834 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:16.144798 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da02fcd2beeddada82a86e6a5cf014b730036bc9f1feee395c073c70c56cdd0e"} err="failed to get container status \"da02fcd2beeddada82a86e6a5cf014b730036bc9f1feee395c073c70c56cdd0e\": rpc error: code = NotFound desc = could not find container \"da02fcd2beeddada82a86e6a5cf014b730036bc9f1feee395c073c70c56cdd0e\": container with ID starting with da02fcd2beeddada82a86e6a5cf014b730036bc9f1feee395c073c70c56cdd0e not found: ID does not exist" Apr 25 00:21:16.144834 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:16.144816 2576 scope.go:117] "RemoveContainer" containerID="7db96fdfe87789aaa7352a18b248887e4559abf54e45e93c9fc59bedb0f73d87" Apr 25 00:21:16.145085 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:21:16.145069 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7db96fdfe87789aaa7352a18b248887e4559abf54e45e93c9fc59bedb0f73d87\": container with ID starting with 
7db96fdfe87789aaa7352a18b248887e4559abf54e45e93c9fc59bedb0f73d87 not found: ID does not exist" containerID="7db96fdfe87789aaa7352a18b248887e4559abf54e45e93c9fc59bedb0f73d87" Apr 25 00:21:16.145144 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:16.145088 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7db96fdfe87789aaa7352a18b248887e4559abf54e45e93c9fc59bedb0f73d87"} err="failed to get container status \"7db96fdfe87789aaa7352a18b248887e4559abf54e45e93c9fc59bedb0f73d87\": rpc error: code = NotFound desc = could not find container \"7db96fdfe87789aaa7352a18b248887e4559abf54e45e93c9fc59bedb0f73d87\": container with ID starting with 7db96fdfe87789aaa7352a18b248887e4559abf54e45e93c9fc59bedb0f73d87 not found: ID does not exist" Apr 25 00:21:16.145144 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:16.145103 2576 scope.go:117] "RemoveContainer" containerID="34f1fbcbfec2229c654dc96625bd7816e4bf5a4d7d3e538cfbf1be6a59ea8fb0" Apr 25 00:21:16.145292 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:21:16.145277 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34f1fbcbfec2229c654dc96625bd7816e4bf5a4d7d3e538cfbf1be6a59ea8fb0\": container with ID starting with 34f1fbcbfec2229c654dc96625bd7816e4bf5a4d7d3e538cfbf1be6a59ea8fb0 not found: ID does not exist" containerID="34f1fbcbfec2229c654dc96625bd7816e4bf5a4d7d3e538cfbf1be6a59ea8fb0" Apr 25 00:21:16.145335 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:16.145296 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34f1fbcbfec2229c654dc96625bd7816e4bf5a4d7d3e538cfbf1be6a59ea8fb0"} err="failed to get container status \"34f1fbcbfec2229c654dc96625bd7816e4bf5a4d7d3e538cfbf1be6a59ea8fb0\": rpc error: code = NotFound desc = could not find container \"34f1fbcbfec2229c654dc96625bd7816e4bf5a4d7d3e538cfbf1be6a59ea8fb0\": container with ID starting with 
34f1fbcbfec2229c654dc96625bd7816e4bf5a4d7d3e538cfbf1be6a59ea8fb0 not found: ID does not exist" Apr 25 00:21:16.314716 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:16.314688 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="498109a8-1613-42a0-8e65-4cc5a73988b0" path="/var/lib/kubelet/pods/498109a8-1613-42a0-8e65-4cc5a73988b0/volumes" Apr 25 00:21:18.123295 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:18.123262 2576 generic.go:358] "Generic (PLEG): container finished" podID="db14f507-793c-418f-8098-0cabb29c618d" containerID="bb95556111d686d303c19a83ff97f10912bfe9cf5bb7ae56b852675f7fca51a8" exitCode=0 Apr 25 00:21:18.123667 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:18.123340 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" event={"ID":"db14f507-793c-418f-8098-0cabb29c618d","Type":"ContainerDied","Data":"bb95556111d686d303c19a83ff97f10912bfe9cf5bb7ae56b852675f7fca51a8"} Apr 25 00:21:25.149356 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:25.149321 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" event={"ID":"db14f507-793c-418f-8098-0cabb29c618d","Type":"ContainerStarted","Data":"dcd8648d532c85902d1fb2ff3f21b7d4670268ed0bb1b1cfb9fc37ec47858351"} Apr 25 00:21:25.149664 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:25.149359 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" event={"ID":"db14f507-793c-418f-8098-0cabb29c618d","Type":"ContainerStarted","Data":"a76a55cccd233c10e28ab4ef428d8dc7bb9727f0595d3133adfb81c731dc2186"} Apr 25 00:21:25.149719 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:25.149689 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" Apr 25 00:21:25.149758 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:25.149723 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" Apr 25 00:21:25.151077 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:25.151055 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" podUID="db14f507-793c-418f-8098-0cabb29c618d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 25 00:21:25.166462 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:25.166424 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" podStartSLOduration=5.265539133 podStartE2EDuration="12.16641231s" podCreationTimestamp="2026-04-25 00:21:13 +0000 UTC" firstStartedPulling="2026-04-25 00:21:18.124481572 +0000 UTC m=+1638.405598470" lastFinishedPulling="2026-04-25 00:21:25.025354749 +0000 UTC m=+1645.306471647" observedRunningTime="2026-04-25 00:21:25.165946449 +0000 UTC m=+1645.447063361" watchObservedRunningTime="2026-04-25 00:21:25.16641231 +0000 UTC m=+1645.447529293" Apr 25 00:21:26.153860 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:26.153810 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" podUID="db14f507-793c-418f-8098-0cabb29c618d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 25 00:21:31.157728 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:31.157697 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" Apr 25 00:21:31.158249 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:31.158219 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" podUID="db14f507-793c-418f-8098-0cabb29c618d" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 25 00:21:41.158891 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:41.158854 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" podUID="db14f507-793c-418f-8098-0cabb29c618d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 25 00:21:51.158547 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:21:51.158508 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" podUID="db14f507-793c-418f-8098-0cabb29c618d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 25 00:22:01.159248 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:01.159209 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" podUID="db14f507-793c-418f-8098-0cabb29c618d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 25 00:22:11.158976 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:11.158868 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" podUID="db14f507-793c-418f-8098-0cabb29c618d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 25 00:22:21.158307 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:21.158265 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" podUID="db14f507-793c-418f-8098-0cabb29c618d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 25 00:22:31.158982 ip-10-0-129-109 
kubenswrapper[2576]: I0425 00:22:31.158939 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" podUID="db14f507-793c-418f-8098-0cabb29c618d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 25 00:22:41.159714 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:41.159681 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" Apr 25 00:22:44.222017 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.221970 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g"] Apr 25 00:22:44.222379 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.222291 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" podUID="db14f507-793c-418f-8098-0cabb29c618d" containerName="kube-rbac-proxy" containerID="cri-o://dcd8648d532c85902d1fb2ff3f21b7d4670268ed0bb1b1cfb9fc37ec47858351" gracePeriod=30 Apr 25 00:22:44.222379 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.222274 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" podUID="db14f507-793c-418f-8098-0cabb29c618d" containerName="kserve-container" containerID="cri-o://a76a55cccd233c10e28ab4ef428d8dc7bb9727f0595d3133adfb81c731dc2186" gracePeriod=30 Apr 25 00:22:44.338820 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.338793 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627"] Apr 25 00:22:44.339074 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.339062 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="498109a8-1613-42a0-8e65-4cc5a73988b0" containerName="kube-rbac-proxy" Apr 25 
00:22:44.339125 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.339076 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="498109a8-1613-42a0-8e65-4cc5a73988b0" containerName="kube-rbac-proxy" Apr 25 00:22:44.339125 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.339094 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="498109a8-1613-42a0-8e65-4cc5a73988b0" containerName="storage-initializer" Apr 25 00:22:44.339125 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.339100 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="498109a8-1613-42a0-8e65-4cc5a73988b0" containerName="storage-initializer" Apr 25 00:22:44.339125 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.339112 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="498109a8-1613-42a0-8e65-4cc5a73988b0" containerName="kserve-container" Apr 25 00:22:44.339125 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.339118 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="498109a8-1613-42a0-8e65-4cc5a73988b0" containerName="kserve-container" Apr 25 00:22:44.339276 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.339166 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="498109a8-1613-42a0-8e65-4cc5a73988b0" containerName="kserve-container" Apr 25 00:22:44.339276 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.339173 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="498109a8-1613-42a0-8e65-4cc5a73988b0" containerName="kube-rbac-proxy" Apr 25 00:22:44.341974 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.341956 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" Apr 25 00:22:44.343971 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.343951 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-predictor-serving-cert\"" Apr 25 00:22:44.344069 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.344011 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-kube-rbac-proxy-sar-config\"" Apr 25 00:22:44.351090 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.351067 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627"] Apr 25 00:22:44.372335 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.372308 2576 generic.go:358] "Generic (PLEG): container finished" podID="db14f507-793c-418f-8098-0cabb29c618d" containerID="dcd8648d532c85902d1fb2ff3f21b7d4670268ed0bb1b1cfb9fc37ec47858351" exitCode=2 Apr 25 00:22:44.372433 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.372358 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" event={"ID":"db14f507-793c-418f-8098-0cabb29c618d","Type":"ContainerDied","Data":"dcd8648d532c85902d1fb2ff3f21b7d4670268ed0bb1b1cfb9fc37ec47858351"} Apr 25 00:22:44.459852 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.459826 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfzgt\" (UniqueName: \"kubernetes.io/projected/39a29c48-ba2d-451e-9e58-c080cca209ac-kube-api-access-wfzgt\") pod \"isvc-pmml-runtime-predictor-67bc544947-tr627\" (UID: \"39a29c48-ba2d-451e-9e58-c080cca209ac\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" Apr 25 00:22:44.459995 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.459858 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39a29c48-ba2d-451e-9e58-c080cca209ac-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-tr627\" (UID: \"39a29c48-ba2d-451e-9e58-c080cca209ac\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" Apr 25 00:22:44.460086 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.460066 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/39a29c48-ba2d-451e-9e58-c080cca209ac-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-tr627\" (UID: \"39a29c48-ba2d-451e-9e58-c080cca209ac\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" Apr 25 00:22:44.460133 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.460101 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/39a29c48-ba2d-451e-9e58-c080cca209ac-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-tr627\" (UID: \"39a29c48-ba2d-451e-9e58-c080cca209ac\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" Apr 25 00:22:44.561304 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.561273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/39a29c48-ba2d-451e-9e58-c080cca209ac-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-tr627\" (UID: \"39a29c48-ba2d-451e-9e58-c080cca209ac\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" Apr 25 00:22:44.561479 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.561314 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/39a29c48-ba2d-451e-9e58-c080cca209ac-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-tr627\" (UID: \"39a29c48-ba2d-451e-9e58-c080cca209ac\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" Apr 25 00:22:44.561479 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.561357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wfzgt\" (UniqueName: \"kubernetes.io/projected/39a29c48-ba2d-451e-9e58-c080cca209ac-kube-api-access-wfzgt\") pod \"isvc-pmml-runtime-predictor-67bc544947-tr627\" (UID: \"39a29c48-ba2d-451e-9e58-c080cca209ac\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" Apr 25 00:22:44.561479 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.561385 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39a29c48-ba2d-451e-9e58-c080cca209ac-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-tr627\" (UID: \"39a29c48-ba2d-451e-9e58-c080cca209ac\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" Apr 25 00:22:44.561818 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.561795 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39a29c48-ba2d-451e-9e58-c080cca209ac-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-tr627\" (UID: \"39a29c48-ba2d-451e-9e58-c080cca209ac\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" Apr 25 00:22:44.562082 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.562064 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/39a29c48-ba2d-451e-9e58-c080cca209ac-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-tr627\" (UID: \"39a29c48-ba2d-451e-9e58-c080cca209ac\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" Apr 25 00:22:44.563992 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.563974 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/39a29c48-ba2d-451e-9e58-c080cca209ac-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-tr627\" (UID: \"39a29c48-ba2d-451e-9e58-c080cca209ac\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" Apr 25 00:22:44.569368 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.569345 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfzgt\" (UniqueName: \"kubernetes.io/projected/39a29c48-ba2d-451e-9e58-c080cca209ac-kube-api-access-wfzgt\") pod \"isvc-pmml-runtime-predictor-67bc544947-tr627\" (UID: \"39a29c48-ba2d-451e-9e58-c080cca209ac\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" Apr 25 00:22:44.651697 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.651663 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" Apr 25 00:22:44.773070 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:44.773047 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627"] Apr 25 00:22:44.774937 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:22:44.774900 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39a29c48_ba2d_451e_9e58_c080cca209ac.slice/crio-108e41c96170090793e015fe4a02b5978838fa683fd4d484f2dbc6784b3eeaa3 WatchSource:0}: Error finding container 108e41c96170090793e015fe4a02b5978838fa683fd4d484f2dbc6784b3eeaa3: Status 404 returned error can't find the container with id 108e41c96170090793e015fe4a02b5978838fa683fd4d484f2dbc6784b3eeaa3 Apr 25 00:22:45.376495 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:45.376454 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" event={"ID":"39a29c48-ba2d-451e-9e58-c080cca209ac","Type":"ContainerStarted","Data":"def61f2972972c1b3d952c3288a5ddbc4dda2af4eca22c5bbc9d952b14630d63"} Apr 25 00:22:45.376837 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:45.376508 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" event={"ID":"39a29c48-ba2d-451e-9e58-c080cca209ac","Type":"ContainerStarted","Data":"108e41c96170090793e015fe4a02b5978838fa683fd4d484f2dbc6784b3eeaa3"} Apr 25 00:22:46.154575 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:46.154534 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" podUID="db14f507-793c-418f-8098-0cabb29c618d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.35:8643/healthz\": dial tcp 10.134.0.35:8643: connect: connection refused" Apr 25 
00:22:47.560554 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:47.560534 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" Apr 25 00:22:47.685514 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:47.685425 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/db14f507-793c-418f-8098-0cabb29c618d-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"db14f507-793c-418f-8098-0cabb29c618d\" (UID: \"db14f507-793c-418f-8098-0cabb29c618d\") " Apr 25 00:22:47.685514 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:47.685473 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhz2x\" (UniqueName: \"kubernetes.io/projected/db14f507-793c-418f-8098-0cabb29c618d-kube-api-access-nhz2x\") pod \"db14f507-793c-418f-8098-0cabb29c618d\" (UID: \"db14f507-793c-418f-8098-0cabb29c618d\") " Apr 25 00:22:47.685744 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:47.685551 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db14f507-793c-418f-8098-0cabb29c618d-proxy-tls\") pod \"db14f507-793c-418f-8098-0cabb29c618d\" (UID: \"db14f507-793c-418f-8098-0cabb29c618d\") " Apr 25 00:22:47.685744 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:47.685629 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db14f507-793c-418f-8098-0cabb29c618d-kserve-provision-location\") pod \"db14f507-793c-418f-8098-0cabb29c618d\" (UID: \"db14f507-793c-418f-8098-0cabb29c618d\") " Apr 25 00:22:47.685859 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:47.685762 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/db14f507-793c-418f-8098-0cabb29c618d-isvc-pmml-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-kube-rbac-proxy-sar-config") pod "db14f507-793c-418f-8098-0cabb29c618d" (UID: "db14f507-793c-418f-8098-0cabb29c618d"). InnerVolumeSpecName "isvc-pmml-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:22:47.685961 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:47.685944 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/db14f507-793c-418f-8098-0cabb29c618d-isvc-pmml-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:22:47.686012 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:47.685980 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db14f507-793c-418f-8098-0cabb29c618d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "db14f507-793c-418f-8098-0cabb29c618d" (UID: "db14f507-793c-418f-8098-0cabb29c618d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:22:47.687843 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:47.687821 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db14f507-793c-418f-8098-0cabb29c618d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "db14f507-793c-418f-8098-0cabb29c618d" (UID: "db14f507-793c-418f-8098-0cabb29c618d"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:22:47.687843 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:47.687832 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db14f507-793c-418f-8098-0cabb29c618d-kube-api-access-nhz2x" (OuterVolumeSpecName: "kube-api-access-nhz2x") pod "db14f507-793c-418f-8098-0cabb29c618d" (UID: "db14f507-793c-418f-8098-0cabb29c618d"). InnerVolumeSpecName "kube-api-access-nhz2x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:22:47.786310 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:47.786280 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db14f507-793c-418f-8098-0cabb29c618d-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:22:47.786310 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:47.786306 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db14f507-793c-418f-8098-0cabb29c618d-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:22:47.786477 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:47.786316 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nhz2x\" (UniqueName: \"kubernetes.io/projected/db14f507-793c-418f-8098-0cabb29c618d-kube-api-access-nhz2x\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:22:48.385336 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:48.385304 2576 generic.go:358] "Generic (PLEG): container finished" podID="db14f507-793c-418f-8098-0cabb29c618d" containerID="a76a55cccd233c10e28ab4ef428d8dc7bb9727f0595d3133adfb81c731dc2186" exitCode=0 Apr 25 00:22:48.385484 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:48.385381 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" 
event={"ID":"db14f507-793c-418f-8098-0cabb29c618d","Type":"ContainerDied","Data":"a76a55cccd233c10e28ab4ef428d8dc7bb9727f0595d3133adfb81c731dc2186"} Apr 25 00:22:48.385484 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:48.385397 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" Apr 25 00:22:48.385484 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:48.385417 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g" event={"ID":"db14f507-793c-418f-8098-0cabb29c618d","Type":"ContainerDied","Data":"b5cb03590a1750052a8c5e4684f1ba0498d7c90e07fefa7ad473bee3fca2ac5a"} Apr 25 00:22:48.385484 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:48.385433 2576 scope.go:117] "RemoveContainer" containerID="dcd8648d532c85902d1fb2ff3f21b7d4670268ed0bb1b1cfb9fc37ec47858351" Apr 25 00:22:48.394867 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:48.394850 2576 scope.go:117] "RemoveContainer" containerID="a76a55cccd233c10e28ab4ef428d8dc7bb9727f0595d3133adfb81c731dc2186" Apr 25 00:22:48.402007 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:48.401976 2576 scope.go:117] "RemoveContainer" containerID="bb95556111d686d303c19a83ff97f10912bfe9cf5bb7ae56b852675f7fca51a8" Apr 25 00:22:48.403557 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:48.403534 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g"] Apr 25 00:22:48.407049 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:48.406910 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-9rk2g"] Apr 25 00:22:48.409391 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:48.409373 2576 scope.go:117] "RemoveContainer" containerID="dcd8648d532c85902d1fb2ff3f21b7d4670268ed0bb1b1cfb9fc37ec47858351" Apr 25 00:22:48.409627 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:22:48.409610 2576 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcd8648d532c85902d1fb2ff3f21b7d4670268ed0bb1b1cfb9fc37ec47858351\": container with ID starting with dcd8648d532c85902d1fb2ff3f21b7d4670268ed0bb1b1cfb9fc37ec47858351 not found: ID does not exist" containerID="dcd8648d532c85902d1fb2ff3f21b7d4670268ed0bb1b1cfb9fc37ec47858351" Apr 25 00:22:48.409675 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:48.409636 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcd8648d532c85902d1fb2ff3f21b7d4670268ed0bb1b1cfb9fc37ec47858351"} err="failed to get container status \"dcd8648d532c85902d1fb2ff3f21b7d4670268ed0bb1b1cfb9fc37ec47858351\": rpc error: code = NotFound desc = could not find container \"dcd8648d532c85902d1fb2ff3f21b7d4670268ed0bb1b1cfb9fc37ec47858351\": container with ID starting with dcd8648d532c85902d1fb2ff3f21b7d4670268ed0bb1b1cfb9fc37ec47858351 not found: ID does not exist" Apr 25 00:22:48.409675 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:48.409654 2576 scope.go:117] "RemoveContainer" containerID="a76a55cccd233c10e28ab4ef428d8dc7bb9727f0595d3133adfb81c731dc2186" Apr 25 00:22:48.409954 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:22:48.409936 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a76a55cccd233c10e28ab4ef428d8dc7bb9727f0595d3133adfb81c731dc2186\": container with ID starting with a76a55cccd233c10e28ab4ef428d8dc7bb9727f0595d3133adfb81c731dc2186 not found: ID does not exist" containerID="a76a55cccd233c10e28ab4ef428d8dc7bb9727f0595d3133adfb81c731dc2186" Apr 25 00:22:48.410016 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:48.409960 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a76a55cccd233c10e28ab4ef428d8dc7bb9727f0595d3133adfb81c731dc2186"} err="failed to get container status 
\"a76a55cccd233c10e28ab4ef428d8dc7bb9727f0595d3133adfb81c731dc2186\": rpc error: code = NotFound desc = could not find container \"a76a55cccd233c10e28ab4ef428d8dc7bb9727f0595d3133adfb81c731dc2186\": container with ID starting with a76a55cccd233c10e28ab4ef428d8dc7bb9727f0595d3133adfb81c731dc2186 not found: ID does not exist" Apr 25 00:22:48.410016 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:48.409976 2576 scope.go:117] "RemoveContainer" containerID="bb95556111d686d303c19a83ff97f10912bfe9cf5bb7ae56b852675f7fca51a8" Apr 25 00:22:48.410290 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:22:48.410269 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb95556111d686d303c19a83ff97f10912bfe9cf5bb7ae56b852675f7fca51a8\": container with ID starting with bb95556111d686d303c19a83ff97f10912bfe9cf5bb7ae56b852675f7fca51a8 not found: ID does not exist" containerID="bb95556111d686d303c19a83ff97f10912bfe9cf5bb7ae56b852675f7fca51a8" Apr 25 00:22:48.410336 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:48.410297 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb95556111d686d303c19a83ff97f10912bfe9cf5bb7ae56b852675f7fca51a8"} err="failed to get container status \"bb95556111d686d303c19a83ff97f10912bfe9cf5bb7ae56b852675f7fca51a8\": rpc error: code = NotFound desc = could not find container \"bb95556111d686d303c19a83ff97f10912bfe9cf5bb7ae56b852675f7fca51a8\": container with ID starting with bb95556111d686d303c19a83ff97f10912bfe9cf5bb7ae56b852675f7fca51a8 not found: ID does not exist" Apr 25 00:22:49.389547 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:49.389511 2576 generic.go:358] "Generic (PLEG): container finished" podID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerID="def61f2972972c1b3d952c3288a5ddbc4dda2af4eca22c5bbc9d952b14630d63" exitCode=0 Apr 25 00:22:49.389959 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:49.389583 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" event={"ID":"39a29c48-ba2d-451e-9e58-c080cca209ac","Type":"ContainerDied","Data":"def61f2972972c1b3d952c3288a5ddbc4dda2af4eca22c5bbc9d952b14630d63"} Apr 25 00:22:50.315054 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:50.315022 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db14f507-793c-418f-8098-0cabb29c618d" path="/var/lib/kubelet/pods/db14f507-793c-418f-8098-0cabb29c618d/volumes" Apr 25 00:22:50.394342 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:50.394311 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" event={"ID":"39a29c48-ba2d-451e-9e58-c080cca209ac","Type":"ContainerStarted","Data":"6541477dbf2c8630a90128771d40da0cd2d2407b4bed4c5ff3b157f4c13f37e5"} Apr 25 00:22:50.394692 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:50.394353 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" event={"ID":"39a29c48-ba2d-451e-9e58-c080cca209ac","Type":"ContainerStarted","Data":"14a0eee1a6cd59c58458cb11db6d82c8ed112b593000bee4eb4808da59bcbcd8"} Apr 25 00:22:50.394692 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:50.394555 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" Apr 25 00:22:50.412133 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:50.412075 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" podStartSLOduration=6.41206141 podStartE2EDuration="6.41206141s" podCreationTimestamp="2026-04-25 00:22:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:22:50.410740149 +0000 UTC m=+1730.691857069" 
watchObservedRunningTime="2026-04-25 00:22:50.41206141 +0000 UTC m=+1730.693178331" Apr 25 00:22:51.397329 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:51.397295 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" Apr 25 00:22:51.398467 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:51.398441 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" podUID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 25 00:22:52.400377 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:52.400336 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" podUID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 25 00:22:57.405689 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:57.405660 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" Apr 25 00:22:57.406290 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:22:57.406264 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" podUID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 25 00:23:07.406584 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:23:07.406547 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" podUID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.36:8080: connect: connection refused" Apr 25 00:23:17.406572 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:23:17.406519 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" podUID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 25 00:23:27.407004 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:23:27.406965 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" podUID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 25 00:23:37.407208 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:23:37.407161 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" podUID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 25 00:23:47.406629 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:23:47.406589 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" podUID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 25 00:23:57.407194 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:23:57.407153 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" podUID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 25 00:24:07.407045 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:07.407015 
2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" Apr 25 00:24:15.331039 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.331003 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627"] Apr 25 00:24:15.331585 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.331317 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" podUID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerName="kserve-container" containerID="cri-o://14a0eee1a6cd59c58458cb11db6d82c8ed112b593000bee4eb4808da59bcbcd8" gracePeriod=30 Apr 25 00:24:15.331585 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.331350 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" podUID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerName="kube-rbac-proxy" containerID="cri-o://6541477dbf2c8630a90128771d40da0cd2d2407b4bed4c5ff3b157f4c13f37e5" gracePeriod=30 Apr 25 00:24:15.425855 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.425820 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946"] Apr 25 00:24:15.426147 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.426134 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db14f507-793c-418f-8098-0cabb29c618d" containerName="kube-rbac-proxy" Apr 25 00:24:15.426201 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.426149 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="db14f507-793c-418f-8098-0cabb29c618d" containerName="kube-rbac-proxy" Apr 25 00:24:15.426201 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.426164 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="db14f507-793c-418f-8098-0cabb29c618d" containerName="storage-initializer" Apr 25 00:24:15.426201 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.426173 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="db14f507-793c-418f-8098-0cabb29c618d" containerName="storage-initializer" Apr 25 00:24:15.426201 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.426185 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db14f507-793c-418f-8098-0cabb29c618d" containerName="kserve-container" Apr 25 00:24:15.426201 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.426190 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="db14f507-793c-418f-8098-0cabb29c618d" containerName="kserve-container" Apr 25 00:24:15.426371 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.426233 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="db14f507-793c-418f-8098-0cabb29c618d" containerName="kube-rbac-proxy" Apr 25 00:24:15.426371 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.426242 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="db14f507-793c-418f-8098-0cabb29c618d" containerName="kserve-container" Apr 25 00:24:15.429191 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.429174 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" Apr 25 00:24:15.431447 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.431412 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-predictor-serving-cert\"" Apr 25 00:24:15.431546 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.431421 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 25 00:24:15.437400 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.437373 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946"] Apr 25 00:24:15.534860 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.534819 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c2fm\" (UniqueName: \"kubernetes.io/projected/865290dc-6b08-4e2a-b781-cb28a8dd57a0-kube-api-access-8c2fm\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946\" (UID: \"865290dc-6b08-4e2a-b781-cb28a8dd57a0\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" Apr 25 00:24:15.535068 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.534866 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/865290dc-6b08-4e2a-b781-cb28a8dd57a0-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946\" (UID: \"865290dc-6b08-4e2a-b781-cb28a8dd57a0\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" Apr 25 00:24:15.535068 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.534938 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/865290dc-6b08-4e2a-b781-cb28a8dd57a0-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946\" (UID: \"865290dc-6b08-4e2a-b781-cb28a8dd57a0\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" Apr 25 00:24:15.535068 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.534981 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/865290dc-6b08-4e2a-b781-cb28a8dd57a0-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946\" (UID: \"865290dc-6b08-4e2a-b781-cb28a8dd57a0\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" Apr 25 00:24:15.633615 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.633537 2576 generic.go:358] "Generic (PLEG): container finished" podID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerID="6541477dbf2c8630a90128771d40da0cd2d2407b4bed4c5ff3b157f4c13f37e5" exitCode=2 Apr 25 00:24:15.633749 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.633612 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" event={"ID":"39a29c48-ba2d-451e-9e58-c080cca209ac","Type":"ContainerDied","Data":"6541477dbf2c8630a90128771d40da0cd2d2407b4bed4c5ff3b157f4c13f37e5"} Apr 25 00:24:15.635897 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.635876 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/865290dc-6b08-4e2a-b781-cb28a8dd57a0-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946\" (UID: \"865290dc-6b08-4e2a-b781-cb28a8dd57a0\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" Apr 25 00:24:15.636010 ip-10-0-129-109 kubenswrapper[2576]: I0425 
00:24:15.635965 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8c2fm\" (UniqueName: \"kubernetes.io/projected/865290dc-6b08-4e2a-b781-cb28a8dd57a0-kube-api-access-8c2fm\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946\" (UID: \"865290dc-6b08-4e2a-b781-cb28a8dd57a0\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" Apr 25 00:24:15.636010 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.635989 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/865290dc-6b08-4e2a-b781-cb28a8dd57a0-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946\" (UID: \"865290dc-6b08-4e2a-b781-cb28a8dd57a0\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" Apr 25 00:24:15.636010 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.636007 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/865290dc-6b08-4e2a-b781-cb28a8dd57a0-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946\" (UID: \"865290dc-6b08-4e2a-b781-cb28a8dd57a0\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" Apr 25 00:24:15.636398 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.636373 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/865290dc-6b08-4e2a-b781-cb28a8dd57a0-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946\" (UID: \"865290dc-6b08-4e2a-b781-cb28a8dd57a0\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" Apr 25 00:24:15.636562 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.636543 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" 
(UniqueName: \"kubernetes.io/configmap/865290dc-6b08-4e2a-b781-cb28a8dd57a0-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946\" (UID: \"865290dc-6b08-4e2a-b781-cb28a8dd57a0\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" Apr 25 00:24:15.638581 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.638563 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/865290dc-6b08-4e2a-b781-cb28a8dd57a0-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946\" (UID: \"865290dc-6b08-4e2a-b781-cb28a8dd57a0\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" Apr 25 00:24:15.643965 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.643942 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c2fm\" (UniqueName: \"kubernetes.io/projected/865290dc-6b08-4e2a-b781-cb28a8dd57a0-kube-api-access-8c2fm\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946\" (UID: \"865290dc-6b08-4e2a-b781-cb28a8dd57a0\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" Apr 25 00:24:15.740513 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.740483 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" Apr 25 00:24:15.862087 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:15.862062 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946"] Apr 25 00:24:15.864538 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:24:15.864510 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod865290dc_6b08_4e2a_b781_cb28a8dd57a0.slice/crio-b8d1ea0a3b6da017d097ca94787731e395ac1abb0963066314b1632ffdc7ec33 WatchSource:0}: Error finding container b8d1ea0a3b6da017d097ca94787731e395ac1abb0963066314b1632ffdc7ec33: Status 404 returned error can't find the container with id b8d1ea0a3b6da017d097ca94787731e395ac1abb0963066314b1632ffdc7ec33 Apr 25 00:24:16.637961 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:16.637907 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" event={"ID":"865290dc-6b08-4e2a-b781-cb28a8dd57a0","Type":"ContainerStarted","Data":"a7c7cb61b106e1740568c6d968cc3dbc4c60aaf55985e7349a91388ab79d1ae1"} Apr 25 00:24:16.637961 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:16.637963 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" event={"ID":"865290dc-6b08-4e2a-b781-cb28a8dd57a0","Type":"ContainerStarted","Data":"b8d1ea0a3b6da017d097ca94787731e395ac1abb0963066314b1632ffdc7ec33"} Apr 25 00:24:17.401560 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:17.401520 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" podUID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.36:8643/healthz\": dial tcp 10.134.0.36:8643: connect: connection 
refused" Apr 25 00:24:17.406341 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:17.406316 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" podUID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 25 00:24:18.645564 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:18.645539 2576 generic.go:358] "Generic (PLEG): container finished" podID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerID="14a0eee1a6cd59c58458cb11db6d82c8ed112b593000bee4eb4808da59bcbcd8" exitCode=0 Apr 25 00:24:18.645833 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:18.645616 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" event={"ID":"39a29c48-ba2d-451e-9e58-c080cca209ac","Type":"ContainerDied","Data":"14a0eee1a6cd59c58458cb11db6d82c8ed112b593000bee4eb4808da59bcbcd8"} Apr 25 00:24:18.767787 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:18.767765 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" Apr 25 00:24:18.860151 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:18.860082 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/39a29c48-ba2d-451e-9e58-c080cca209ac-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"39a29c48-ba2d-451e-9e58-c080cca209ac\" (UID: \"39a29c48-ba2d-451e-9e58-c080cca209ac\") " Apr 25 00:24:18.860151 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:18.860130 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/39a29c48-ba2d-451e-9e58-c080cca209ac-proxy-tls\") pod \"39a29c48-ba2d-451e-9e58-c080cca209ac\" (UID: \"39a29c48-ba2d-451e-9e58-c080cca209ac\") " Apr 25 00:24:18.860314 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:18.860156 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfzgt\" (UniqueName: \"kubernetes.io/projected/39a29c48-ba2d-451e-9e58-c080cca209ac-kube-api-access-wfzgt\") pod \"39a29c48-ba2d-451e-9e58-c080cca209ac\" (UID: \"39a29c48-ba2d-451e-9e58-c080cca209ac\") " Apr 25 00:24:18.860314 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:18.860179 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39a29c48-ba2d-451e-9e58-c080cca209ac-kserve-provision-location\") pod \"39a29c48-ba2d-451e-9e58-c080cca209ac\" (UID: \"39a29c48-ba2d-451e-9e58-c080cca209ac\") " Apr 25 00:24:18.860518 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:18.860490 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39a29c48-ba2d-451e-9e58-c080cca209ac-isvc-pmml-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-pmml-runtime-kube-rbac-proxy-sar-config") pod "39a29c48-ba2d-451e-9e58-c080cca209ac" (UID: "39a29c48-ba2d-451e-9e58-c080cca209ac"). InnerVolumeSpecName "isvc-pmml-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:24:18.860518 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:18.860507 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39a29c48-ba2d-451e-9e58-c080cca209ac-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "39a29c48-ba2d-451e-9e58-c080cca209ac" (UID: "39a29c48-ba2d-451e-9e58-c080cca209ac"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:24:18.862363 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:18.862330 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39a29c48-ba2d-451e-9e58-c080cca209ac-kube-api-access-wfzgt" (OuterVolumeSpecName: "kube-api-access-wfzgt") pod "39a29c48-ba2d-451e-9e58-c080cca209ac" (UID: "39a29c48-ba2d-451e-9e58-c080cca209ac"). InnerVolumeSpecName "kube-api-access-wfzgt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:24:18.862363 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:18.862354 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a29c48-ba2d-451e-9e58-c080cca209ac-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "39a29c48-ba2d-451e-9e58-c080cca209ac" (UID: "39a29c48-ba2d-451e-9e58-c080cca209ac"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:24:18.961558 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:18.961530 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/39a29c48-ba2d-451e-9e58-c080cca209ac-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:24:18.961558 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:18.961555 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wfzgt\" (UniqueName: \"kubernetes.io/projected/39a29c48-ba2d-451e-9e58-c080cca209ac-kube-api-access-wfzgt\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:24:18.961695 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:18.961564 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39a29c48-ba2d-451e-9e58-c080cca209ac-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:24:18.961695 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:18.961575 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/39a29c48-ba2d-451e-9e58-c080cca209ac-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:24:19.650138 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:19.650098 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" event={"ID":"39a29c48-ba2d-451e-9e58-c080cca209ac","Type":"ContainerDied","Data":"108e41c96170090793e015fe4a02b5978838fa683fd4d484f2dbc6784b3eeaa3"} Apr 25 00:24:19.650532 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:19.650153 2576 scope.go:117] "RemoveContainer" containerID="6541477dbf2c8630a90128771d40da0cd2d2407b4bed4c5ff3b157f4c13f37e5" Apr 25 00:24:19.650532 ip-10-0-129-109 kubenswrapper[2576]: 
I0425 00:24:19.650171 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627" Apr 25 00:24:19.658658 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:19.658637 2576 scope.go:117] "RemoveContainer" containerID="14a0eee1a6cd59c58458cb11db6d82c8ed112b593000bee4eb4808da59bcbcd8" Apr 25 00:24:19.665566 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:19.665544 2576 scope.go:117] "RemoveContainer" containerID="def61f2972972c1b3d952c3288a5ddbc4dda2af4eca22c5bbc9d952b14630d63" Apr 25 00:24:19.671383 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:19.671356 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627"] Apr 25 00:24:19.674479 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:19.674452 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-tr627"] Apr 25 00:24:20.314901 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:20.314871 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39a29c48-ba2d-451e-9e58-c080cca209ac" path="/var/lib/kubelet/pods/39a29c48-ba2d-451e-9e58-c080cca209ac/volumes" Apr 25 00:24:20.659350 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:20.659266 2576 generic.go:358] "Generic (PLEG): container finished" podID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerID="a7c7cb61b106e1740568c6d968cc3dbc4c60aaf55985e7349a91388ab79d1ae1" exitCode=0 Apr 25 00:24:20.659350 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:20.659327 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" event={"ID":"865290dc-6b08-4e2a-b781-cb28a8dd57a0","Type":"ContainerDied","Data":"a7c7cb61b106e1740568c6d968cc3dbc4c60aaf55985e7349a91388ab79d1ae1"} Apr 25 00:24:21.664012 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:21.663974 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" event={"ID":"865290dc-6b08-4e2a-b781-cb28a8dd57a0","Type":"ContainerStarted","Data":"94e42b0605e134e71b54cdf121be7d02f5fc2982b5760834ce16cbaefbe9bd5a"} Apr 25 00:24:21.664373 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:21.664019 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" event={"ID":"865290dc-6b08-4e2a-b781-cb28a8dd57a0","Type":"ContainerStarted","Data":"f7bdef99d5a02138a6a21c8909b455695cac6699b935870c4f0a6ba781d21631"} Apr 25 00:24:21.664373 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:21.664261 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" Apr 25 00:24:22.667438 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:22.667405 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" Apr 25 00:24:22.668758 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:22.668730 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" podUID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 25 00:24:23.670112 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:23.670075 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" podUID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 25 00:24:28.676284 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:28.676257 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" Apr 25 00:24:28.676768 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:28.676742 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" podUID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 25 00:24:28.694502 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:28.694452 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" podStartSLOduration=13.69443753 podStartE2EDuration="13.69443753s" podCreationTimestamp="2026-04-25 00:24:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:24:21.699044268 +0000 UTC m=+1821.980161187" watchObservedRunningTime="2026-04-25 00:24:28.69443753 +0000 UTC m=+1828.975554450" Apr 25 00:24:37.071858 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:37.071828 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 25 00:24:37.074569 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:37.074546 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 25 00:24:38.676973 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:38.676936 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" podUID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 25 00:24:48.677434 ip-10-0-129-109 kubenswrapper[2576]: 
I0425 00:24:48.677394 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" podUID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 25 00:24:58.677156 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:24:58.677112 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" podUID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 25 00:25:08.677703 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:08.677663 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" podUID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 25 00:25:18.677199 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:18.677095 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" podUID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 25 00:25:28.676909 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:28.676866 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" podUID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 25 00:25:31.312201 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:31.312163 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" podUID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 25 00:25:41.313544 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:41.313510 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" Apr 25 00:25:46.451084 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.451046 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946"] Apr 25 00:25:46.451482 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.451434 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" podUID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerName="kserve-container" containerID="cri-o://f7bdef99d5a02138a6a21c8909b455695cac6699b935870c4f0a6ba781d21631" gracePeriod=30 Apr 25 00:25:46.451550 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.451507 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" podUID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerName="kube-rbac-proxy" containerID="cri-o://94e42b0605e134e71b54cdf121be7d02f5fc2982b5760834ce16cbaefbe9bd5a" gracePeriod=30 Apr 25 00:25:46.548271 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.548234 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7"] Apr 25 00:25:46.548526 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.548514 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerName="kube-rbac-proxy" Apr 25 00:25:46.548568 ip-10-0-129-109 kubenswrapper[2576]: 
I0425 00:25:46.548528 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerName="kube-rbac-proxy" Apr 25 00:25:46.548568 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.548544 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerName="storage-initializer" Apr 25 00:25:46.548568 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.548550 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerName="storage-initializer" Apr 25 00:25:46.548568 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.548564 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerName="kserve-container" Apr 25 00:25:46.548693 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.548570 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerName="kserve-container" Apr 25 00:25:46.548693 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.548610 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerName="kserve-container" Apr 25 00:25:46.548693 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.548617 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="39a29c48-ba2d-451e-9e58-c080cca209ac" containerName="kube-rbac-proxy" Apr 25 00:25:46.551662 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.551645 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" Apr 25 00:25:46.554052 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.554012 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-42458c-kube-rbac-proxy-sar-config\"" Apr 25 00:25:46.554408 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.554393 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-42458c-predictor-serving-cert\"" Apr 25 00:25:46.561854 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.561828 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7"] Apr 25 00:25:46.703048 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.702949 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/734f07a4-1148-403e-b849-b60ad7ec15f8-proxy-tls\") pod \"isvc-primary-42458c-predictor-c86f7785d-rxtt7\" (UID: \"734f07a4-1148-403e-b849-b60ad7ec15f8\") " pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" Apr 25 00:25:46.703048 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.703029 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/734f07a4-1148-403e-b849-b60ad7ec15f8-kserve-provision-location\") pod \"isvc-primary-42458c-predictor-c86f7785d-rxtt7\" (UID: \"734f07a4-1148-403e-b849-b60ad7ec15f8\") " pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" Apr 25 00:25:46.703229 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.703061 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-42458c-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/734f07a4-1148-403e-b849-b60ad7ec15f8-isvc-primary-42458c-kube-rbac-proxy-sar-config\") pod \"isvc-primary-42458c-predictor-c86f7785d-rxtt7\" (UID: \"734f07a4-1148-403e-b849-b60ad7ec15f8\") " pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" Apr 25 00:25:46.703229 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.703101 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdp4v\" (UniqueName: \"kubernetes.io/projected/734f07a4-1148-403e-b849-b60ad7ec15f8-kube-api-access-mdp4v\") pod \"isvc-primary-42458c-predictor-c86f7785d-rxtt7\" (UID: \"734f07a4-1148-403e-b849-b60ad7ec15f8\") " pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" Apr 25 00:25:46.804263 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.804217 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/734f07a4-1148-403e-b849-b60ad7ec15f8-kserve-provision-location\") pod \"isvc-primary-42458c-predictor-c86f7785d-rxtt7\" (UID: \"734f07a4-1148-403e-b849-b60ad7ec15f8\") " pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" Apr 25 00:25:46.804452 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.804272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-42458c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/734f07a4-1148-403e-b849-b60ad7ec15f8-isvc-primary-42458c-kube-rbac-proxy-sar-config\") pod \"isvc-primary-42458c-predictor-c86f7785d-rxtt7\" (UID: \"734f07a4-1148-403e-b849-b60ad7ec15f8\") " pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" Apr 25 00:25:46.804452 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.804302 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdp4v\" (UniqueName: 
\"kubernetes.io/projected/734f07a4-1148-403e-b849-b60ad7ec15f8-kube-api-access-mdp4v\") pod \"isvc-primary-42458c-predictor-c86f7785d-rxtt7\" (UID: \"734f07a4-1148-403e-b849-b60ad7ec15f8\") " pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" Apr 25 00:25:46.804452 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.804331 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/734f07a4-1148-403e-b849-b60ad7ec15f8-proxy-tls\") pod \"isvc-primary-42458c-predictor-c86f7785d-rxtt7\" (UID: \"734f07a4-1148-403e-b849-b60ad7ec15f8\") " pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" Apr 25 00:25:46.804452 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:25:46.804442 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-primary-42458c-predictor-serving-cert: secret "isvc-primary-42458c-predictor-serving-cert" not found Apr 25 00:25:46.804619 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:25:46.804507 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/734f07a4-1148-403e-b849-b60ad7ec15f8-proxy-tls podName:734f07a4-1148-403e-b849-b60ad7ec15f8 nodeName:}" failed. No retries permitted until 2026-04-25 00:25:47.304486577 +0000 UTC m=+1907.585603492 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/734f07a4-1148-403e-b849-b60ad7ec15f8-proxy-tls") pod "isvc-primary-42458c-predictor-c86f7785d-rxtt7" (UID: "734f07a4-1148-403e-b849-b60ad7ec15f8") : secret "isvc-primary-42458c-predictor-serving-cert" not found Apr 25 00:25:46.804710 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.804685 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/734f07a4-1148-403e-b849-b60ad7ec15f8-kserve-provision-location\") pod \"isvc-primary-42458c-predictor-c86f7785d-rxtt7\" (UID: \"734f07a4-1148-403e-b849-b60ad7ec15f8\") " pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" Apr 25 00:25:46.805000 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.804983 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-42458c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/734f07a4-1148-403e-b849-b60ad7ec15f8-isvc-primary-42458c-kube-rbac-proxy-sar-config\") pod \"isvc-primary-42458c-predictor-c86f7785d-rxtt7\" (UID: \"734f07a4-1148-403e-b849-b60ad7ec15f8\") " pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" Apr 25 00:25:46.815171 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.815149 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdp4v\" (UniqueName: \"kubernetes.io/projected/734f07a4-1148-403e-b849-b60ad7ec15f8-kube-api-access-mdp4v\") pod \"isvc-primary-42458c-predictor-c86f7785d-rxtt7\" (UID: \"734f07a4-1148-403e-b849-b60ad7ec15f8\") " pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" Apr 25 00:25:46.899695 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.899662 2576 generic.go:358] "Generic (PLEG): container finished" podID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerID="94e42b0605e134e71b54cdf121be7d02f5fc2982b5760834ce16cbaefbe9bd5a" 
exitCode=2 Apr 25 00:25:46.899853 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:46.899724 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" event={"ID":"865290dc-6b08-4e2a-b781-cb28a8dd57a0","Type":"ContainerDied","Data":"94e42b0605e134e71b54cdf121be7d02f5fc2982b5760834ce16cbaefbe9bd5a"} Apr 25 00:25:47.308024 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:47.307984 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/734f07a4-1148-403e-b849-b60ad7ec15f8-proxy-tls\") pod \"isvc-primary-42458c-predictor-c86f7785d-rxtt7\" (UID: \"734f07a4-1148-403e-b849-b60ad7ec15f8\") " pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" Apr 25 00:25:47.310617 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:47.310593 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/734f07a4-1148-403e-b849-b60ad7ec15f8-proxy-tls\") pod \"isvc-primary-42458c-predictor-c86f7785d-rxtt7\" (UID: \"734f07a4-1148-403e-b849-b60ad7ec15f8\") " pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" Apr 25 00:25:47.464441 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:47.464410 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" Apr 25 00:25:47.586527 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:47.586501 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7"] Apr 25 00:25:47.589288 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:25:47.589260 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod734f07a4_1148_403e_b849_b60ad7ec15f8.slice/crio-2e9d33dc1b72ade0b475fcc2599b981ae723da4e1e5c78a3ffa38205fe245bca WatchSource:0}: Error finding container 2e9d33dc1b72ade0b475fcc2599b981ae723da4e1e5c78a3ffa38205fe245bca: Status 404 returned error can't find the container with id 2e9d33dc1b72ade0b475fcc2599b981ae723da4e1e5c78a3ffa38205fe245bca Apr 25 00:25:47.904110 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:47.904015 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" event={"ID":"734f07a4-1148-403e-b849-b60ad7ec15f8","Type":"ContainerStarted","Data":"60137d23113642574c88e825006dd1969b677b31ea0485c247a79ef39cd25931"} Apr 25 00:25:47.904110 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:47.904052 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" event={"ID":"734f07a4-1148-403e-b849-b60ad7ec15f8","Type":"ContainerStarted","Data":"2e9d33dc1b72ade0b475fcc2599b981ae723da4e1e5c78a3ffa38205fe245bca"} Apr 25 00:25:48.671047 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:48.671004 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" podUID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.37:8643/healthz\": dial tcp 10.134.0.37:8643: connect: connection refused" 
Apr 25 00:25:50.086302 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.086278 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" Apr 25 00:25:50.231005 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.230905 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/865290dc-6b08-4e2a-b781-cb28a8dd57a0-proxy-tls\") pod \"865290dc-6b08-4e2a-b781-cb28a8dd57a0\" (UID: \"865290dc-6b08-4e2a-b781-cb28a8dd57a0\") " Apr 25 00:25:50.231005 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.230975 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c2fm\" (UniqueName: \"kubernetes.io/projected/865290dc-6b08-4e2a-b781-cb28a8dd57a0-kube-api-access-8c2fm\") pod \"865290dc-6b08-4e2a-b781-cb28a8dd57a0\" (UID: \"865290dc-6b08-4e2a-b781-cb28a8dd57a0\") " Apr 25 00:25:50.231005 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.231001 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/865290dc-6b08-4e2a-b781-cb28a8dd57a0-kserve-provision-location\") pod \"865290dc-6b08-4e2a-b781-cb28a8dd57a0\" (UID: \"865290dc-6b08-4e2a-b781-cb28a8dd57a0\") " Apr 25 00:25:50.231273 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.231034 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/865290dc-6b08-4e2a-b781-cb28a8dd57a0-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"865290dc-6b08-4e2a-b781-cb28a8dd57a0\" (UID: \"865290dc-6b08-4e2a-b781-cb28a8dd57a0\") " Apr 25 00:25:50.231349 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.231324 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/865290dc-6b08-4e2a-b781-cb28a8dd57a0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "865290dc-6b08-4e2a-b781-cb28a8dd57a0" (UID: "865290dc-6b08-4e2a-b781-cb28a8dd57a0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:25:50.231430 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.231410 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/865290dc-6b08-4e2a-b781-cb28a8dd57a0-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config") pod "865290dc-6b08-4e2a-b781-cb28a8dd57a0" (UID: "865290dc-6b08-4e2a-b781-cb28a8dd57a0"). InnerVolumeSpecName "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:25:50.233229 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.233209 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/865290dc-6b08-4e2a-b781-cb28a8dd57a0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "865290dc-6b08-4e2a-b781-cb28a8dd57a0" (UID: "865290dc-6b08-4e2a-b781-cb28a8dd57a0"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:25:50.233307 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.233245 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/865290dc-6b08-4e2a-b781-cb28a8dd57a0-kube-api-access-8c2fm" (OuterVolumeSpecName: "kube-api-access-8c2fm") pod "865290dc-6b08-4e2a-b781-cb28a8dd57a0" (UID: "865290dc-6b08-4e2a-b781-cb28a8dd57a0"). InnerVolumeSpecName "kube-api-access-8c2fm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:25:50.332351 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.332327 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8c2fm\" (UniqueName: \"kubernetes.io/projected/865290dc-6b08-4e2a-b781-cb28a8dd57a0-kube-api-access-8c2fm\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:25:50.332351 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.332351 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/865290dc-6b08-4e2a-b781-cb28a8dd57a0-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:25:50.332496 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.332366 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/865290dc-6b08-4e2a-b781-cb28a8dd57a0-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:25:50.332496 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.332382 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/865290dc-6b08-4e2a-b781-cb28a8dd57a0-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:25:50.915505 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.915473 2576 generic.go:358] "Generic (PLEG): container finished" podID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerID="f7bdef99d5a02138a6a21c8909b455695cac6699b935870c4f0a6ba781d21631" exitCode=0 Apr 25 00:25:50.915721 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.915535 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" 
event={"ID":"865290dc-6b08-4e2a-b781-cb28a8dd57a0","Type":"ContainerDied","Data":"f7bdef99d5a02138a6a21c8909b455695cac6699b935870c4f0a6ba781d21631"} Apr 25 00:25:50.915721 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.915547 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" Apr 25 00:25:50.915721 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.915568 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946" event={"ID":"865290dc-6b08-4e2a-b781-cb28a8dd57a0","Type":"ContainerDied","Data":"b8d1ea0a3b6da017d097ca94787731e395ac1abb0963066314b1632ffdc7ec33"} Apr 25 00:25:50.915721 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.915591 2576 scope.go:117] "RemoveContainer" containerID="94e42b0605e134e71b54cdf121be7d02f5fc2982b5760834ce16cbaefbe9bd5a" Apr 25 00:25:50.923320 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.923300 2576 scope.go:117] "RemoveContainer" containerID="f7bdef99d5a02138a6a21c8909b455695cac6699b935870c4f0a6ba781d21631" Apr 25 00:25:50.930453 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.930431 2576 scope.go:117] "RemoveContainer" containerID="a7c7cb61b106e1740568c6d968cc3dbc4c60aaf55985e7349a91388ab79d1ae1" Apr 25 00:25:50.932833 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.932811 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946"] Apr 25 00:25:50.936941 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.936907 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-hg946"] Apr 25 00:25:50.938754 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.938732 2576 scope.go:117] "RemoveContainer" containerID="94e42b0605e134e71b54cdf121be7d02f5fc2982b5760834ce16cbaefbe9bd5a" Apr 25 00:25:50.939033 ip-10-0-129-109 
kubenswrapper[2576]: E0425 00:25:50.939017 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94e42b0605e134e71b54cdf121be7d02f5fc2982b5760834ce16cbaefbe9bd5a\": container with ID starting with 94e42b0605e134e71b54cdf121be7d02f5fc2982b5760834ce16cbaefbe9bd5a not found: ID does not exist" containerID="94e42b0605e134e71b54cdf121be7d02f5fc2982b5760834ce16cbaefbe9bd5a" Apr 25 00:25:50.939090 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.939049 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e42b0605e134e71b54cdf121be7d02f5fc2982b5760834ce16cbaefbe9bd5a"} err="failed to get container status \"94e42b0605e134e71b54cdf121be7d02f5fc2982b5760834ce16cbaefbe9bd5a\": rpc error: code = NotFound desc = could not find container \"94e42b0605e134e71b54cdf121be7d02f5fc2982b5760834ce16cbaefbe9bd5a\": container with ID starting with 94e42b0605e134e71b54cdf121be7d02f5fc2982b5760834ce16cbaefbe9bd5a not found: ID does not exist" Apr 25 00:25:50.939090 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.939067 2576 scope.go:117] "RemoveContainer" containerID="f7bdef99d5a02138a6a21c8909b455695cac6699b935870c4f0a6ba781d21631" Apr 25 00:25:50.939312 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:25:50.939295 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7bdef99d5a02138a6a21c8909b455695cac6699b935870c4f0a6ba781d21631\": container with ID starting with f7bdef99d5a02138a6a21c8909b455695cac6699b935870c4f0a6ba781d21631 not found: ID does not exist" containerID="f7bdef99d5a02138a6a21c8909b455695cac6699b935870c4f0a6ba781d21631" Apr 25 00:25:50.939357 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.939318 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7bdef99d5a02138a6a21c8909b455695cac6699b935870c4f0a6ba781d21631"} 
err="failed to get container status \"f7bdef99d5a02138a6a21c8909b455695cac6699b935870c4f0a6ba781d21631\": rpc error: code = NotFound desc = could not find container \"f7bdef99d5a02138a6a21c8909b455695cac6699b935870c4f0a6ba781d21631\": container with ID starting with f7bdef99d5a02138a6a21c8909b455695cac6699b935870c4f0a6ba781d21631 not found: ID does not exist" Apr 25 00:25:50.939357 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.939334 2576 scope.go:117] "RemoveContainer" containerID="a7c7cb61b106e1740568c6d968cc3dbc4c60aaf55985e7349a91388ab79d1ae1" Apr 25 00:25:50.939527 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:25:50.939508 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7c7cb61b106e1740568c6d968cc3dbc4c60aaf55985e7349a91388ab79d1ae1\": container with ID starting with a7c7cb61b106e1740568c6d968cc3dbc4c60aaf55985e7349a91388ab79d1ae1 not found: ID does not exist" containerID="a7c7cb61b106e1740568c6d968cc3dbc4c60aaf55985e7349a91388ab79d1ae1" Apr 25 00:25:50.939568 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:50.939530 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7c7cb61b106e1740568c6d968cc3dbc4c60aaf55985e7349a91388ab79d1ae1"} err="failed to get container status \"a7c7cb61b106e1740568c6d968cc3dbc4c60aaf55985e7349a91388ab79d1ae1\": rpc error: code = NotFound desc = could not find container \"a7c7cb61b106e1740568c6d968cc3dbc4c60aaf55985e7349a91388ab79d1ae1\": container with ID starting with a7c7cb61b106e1740568c6d968cc3dbc4c60aaf55985e7349a91388ab79d1ae1 not found: ID does not exist" Apr 25 00:25:51.924739 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:51.924703 2576 generic.go:358] "Generic (PLEG): container finished" podID="734f07a4-1148-403e-b849-b60ad7ec15f8" containerID="60137d23113642574c88e825006dd1969b677b31ea0485c247a79ef39cd25931" exitCode=0 Apr 25 00:25:51.925187 ip-10-0-129-109 kubenswrapper[2576]: I0425 
00:25:51.924766 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" event={"ID":"734f07a4-1148-403e-b849-b60ad7ec15f8","Type":"ContainerDied","Data":"60137d23113642574c88e825006dd1969b677b31ea0485c247a79ef39cd25931"} Apr 25 00:25:52.315408 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:52.315374 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" path="/var/lib/kubelet/pods/865290dc-6b08-4e2a-b781-cb28a8dd57a0/volumes" Apr 25 00:25:52.929475 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:52.929446 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" event={"ID":"734f07a4-1148-403e-b849-b60ad7ec15f8","Type":"ContainerStarted","Data":"4fbe4235e5b8826dbf5484d671d8cbf79c43a48faf0a6bafb99199a45f82bac4"} Apr 25 00:25:52.929816 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:52.929484 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" event={"ID":"734f07a4-1148-403e-b849-b60ad7ec15f8","Type":"ContainerStarted","Data":"4b0e205458f52ff410ac204b01a4f58cdaac1744a0bd6408e9685f3f086527db"} Apr 25 00:25:52.929816 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:52.929690 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" Apr 25 00:25:52.969125 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:52.969074 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" podStartSLOduration=6.969059671 podStartE2EDuration="6.969059671s" podCreationTimestamp="2026-04-25 00:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 
00:25:52.966860834 +0000 UTC m=+1913.247977755" watchObservedRunningTime="2026-04-25 00:25:52.969059671 +0000 UTC m=+1913.250176591" Apr 25 00:25:53.932372 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:53.932339 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" Apr 25 00:25:53.933410 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:53.933385 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" podUID="734f07a4-1148-403e-b849-b60ad7ec15f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 25 00:25:54.934877 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:54.934832 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" podUID="734f07a4-1148-403e-b849-b60ad7ec15f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 25 00:25:59.939083 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:59.939054 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" Apr 25 00:25:59.939675 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:25:59.939642 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" podUID="734f07a4-1148-403e-b849-b60ad7ec15f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 25 00:26:09.940216 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:26:09.940179 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" podUID="734f07a4-1148-403e-b849-b60ad7ec15f8" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 25 00:26:19.940459 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:26:19.940415 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" podUID="734f07a4-1148-403e-b849-b60ad7ec15f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 25 00:26:29.939771 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:26:29.939733 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" podUID="734f07a4-1148-403e-b849-b60ad7ec15f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 25 00:26:39.939949 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:26:39.939843 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" podUID="734f07a4-1148-403e-b849-b60ad7ec15f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 25 00:26:49.940028 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:26:49.939991 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" podUID="734f07a4-1148-403e-b849-b60ad7ec15f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 25 00:26:59.940880 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:26:59.940851 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" Apr 25 00:27:06.676206 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.676153 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4"] Apr 25 00:27:06.676577 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.676453 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerName="storage-initializer" Apr 25 00:27:06.676577 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.676464 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerName="storage-initializer" Apr 25 00:27:06.676577 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.676473 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerName="kserve-container" Apr 25 00:27:06.676577 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.676479 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerName="kserve-container" Apr 25 00:27:06.676577 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.676496 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerName="kube-rbac-proxy" Apr 25 00:27:06.676577 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.676504 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerName="kube-rbac-proxy" Apr 25 00:27:06.676577 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.676553 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerName="kube-rbac-proxy" Apr 25 00:27:06.676577 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.676560 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="865290dc-6b08-4e2a-b781-cb28a8dd57a0" containerName="kserve-container" Apr 25 00:27:06.679813 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.679781 2576 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" Apr 25 00:27:06.682181 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.682160 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-42458c-dockercfg-8xjc4\"" Apr 25 00:27:06.682332 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.682222 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-42458c-predictor-serving-cert\"" Apr 25 00:27:06.682332 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.682293 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-42458c\"" Apr 25 00:27:06.682472 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.682327 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-42458c-kube-rbac-proxy-sar-config\"" Apr 25 00:27:06.682532 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.682473 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 25 00:27:06.690514 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.690491 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4"] Apr 25 00:27:06.786788 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.786755 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-proxy-tls\") pod \"isvc-secondary-42458c-predictor-68f6f7f769-4kjd4\" (UID: \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\") " pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" Apr 25 00:27:06.786969 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.786797 
2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-42458c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-isvc-secondary-42458c-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-42458c-predictor-68f6f7f769-4kjd4\" (UID: \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\") " pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" Apr 25 00:27:06.786969 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.786829 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xf79\" (UniqueName: \"kubernetes.io/projected/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-kube-api-access-7xf79\") pod \"isvc-secondary-42458c-predictor-68f6f7f769-4kjd4\" (UID: \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\") " pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" Apr 25 00:27:06.786969 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.786857 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-cabundle-cert\") pod \"isvc-secondary-42458c-predictor-68f6f7f769-4kjd4\" (UID: \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\") " pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" Apr 25 00:27:06.786969 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.786891 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-kserve-provision-location\") pod \"isvc-secondary-42458c-predictor-68f6f7f769-4kjd4\" (UID: \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\") " pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" Apr 25 00:27:06.887996 ip-10-0-129-109 kubenswrapper[2576]: I0425 
00:27:06.887968 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-kserve-provision-location\") pod \"isvc-secondary-42458c-predictor-68f6f7f769-4kjd4\" (UID: \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\") " pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" Apr 25 00:27:06.888127 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.888009 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-proxy-tls\") pod \"isvc-secondary-42458c-predictor-68f6f7f769-4kjd4\" (UID: \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\") " pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" Apr 25 00:27:06.888127 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.888036 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-42458c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-isvc-secondary-42458c-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-42458c-predictor-68f6f7f769-4kjd4\" (UID: \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\") " pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" Apr 25 00:27:06.888249 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:27:06.888124 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-secondary-42458c-predictor-serving-cert: secret "isvc-secondary-42458c-predictor-serving-cert" not found Apr 25 00:27:06.888249 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.888148 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xf79\" (UniqueName: \"kubernetes.io/projected/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-kube-api-access-7xf79\") pod \"isvc-secondary-42458c-predictor-68f6f7f769-4kjd4\" (UID: 
\"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\") " pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" Apr 25 00:27:06.888249 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:27:06.888205 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-proxy-tls podName:4c648323-9cf4-4ff4-ae5a-85687a1af0aa nodeName:}" failed. No retries permitted until 2026-04-25 00:27:07.388183407 +0000 UTC m=+1987.669300305 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-proxy-tls") pod "isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" (UID: "4c648323-9cf4-4ff4-ae5a-85687a1af0aa") : secret "isvc-secondary-42458c-predictor-serving-cert" not found Apr 25 00:27:06.888249 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.888225 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-cabundle-cert\") pod \"isvc-secondary-42458c-predictor-68f6f7f769-4kjd4\" (UID: \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\") " pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" Apr 25 00:27:06.888464 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.888417 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-kserve-provision-location\") pod \"isvc-secondary-42458c-predictor-68f6f7f769-4kjd4\" (UID: \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\") " pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" Apr 25 00:27:06.888677 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.888659 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-42458c-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-isvc-secondary-42458c-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-42458c-predictor-68f6f7f769-4kjd4\" (UID: \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\") " pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" Apr 25 00:27:06.888734 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.888718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-cabundle-cert\") pod \"isvc-secondary-42458c-predictor-68f6f7f769-4kjd4\" (UID: \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\") " pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" Apr 25 00:27:06.896639 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:06.896621 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xf79\" (UniqueName: \"kubernetes.io/projected/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-kube-api-access-7xf79\") pod \"isvc-secondary-42458c-predictor-68f6f7f769-4kjd4\" (UID: \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\") " pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" Apr 25 00:27:07.391954 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:07.391894 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-proxy-tls\") pod \"isvc-secondary-42458c-predictor-68f6f7f769-4kjd4\" (UID: \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\") " pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" Apr 25 00:27:07.394449 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:07.394428 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-proxy-tls\") pod \"isvc-secondary-42458c-predictor-68f6f7f769-4kjd4\" (UID: 
\"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\") " pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" Apr 25 00:27:07.591216 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:07.591189 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" Apr 25 00:27:07.708608 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:07.708575 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4"] Apr 25 00:27:07.711697 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:27:07.711666 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c648323_9cf4_4ff4_ae5a_85687a1af0aa.slice/crio-90f1d04357dac0d66a54416da6687a99e32181d37b498660ad973ad681b2c601 WatchSource:0}: Error finding container 90f1d04357dac0d66a54416da6687a99e32181d37b498660ad973ad681b2c601: Status 404 returned error can't find the container with id 90f1d04357dac0d66a54416da6687a99e32181d37b498660ad973ad681b2c601 Apr 25 00:27:07.713551 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:07.713535 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:27:08.139482 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:08.139449 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" event={"ID":"4c648323-9cf4-4ff4-ae5a-85687a1af0aa","Type":"ContainerStarted","Data":"df6d4a26700510b0783e300407e203c9c549b3a5f5ee209f489a32ee180e1b2e"} Apr 25 00:27:08.139482 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:08.139483 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" 
event={"ID":"4c648323-9cf4-4ff4-ae5a-85687a1af0aa","Type":"ContainerStarted","Data":"90f1d04357dac0d66a54416da6687a99e32181d37b498660ad973ad681b2c601"} Apr 25 00:27:13.162555 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:13.162525 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-42458c-predictor-68f6f7f769-4kjd4_4c648323-9cf4-4ff4-ae5a-85687a1af0aa/storage-initializer/0.log" Apr 25 00:27:13.162943 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:13.162569 2576 generic.go:358] "Generic (PLEG): container finished" podID="4c648323-9cf4-4ff4-ae5a-85687a1af0aa" containerID="df6d4a26700510b0783e300407e203c9c549b3a5f5ee209f489a32ee180e1b2e" exitCode=1 Apr 25 00:27:13.162943 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:13.162651 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" event={"ID":"4c648323-9cf4-4ff4-ae5a-85687a1af0aa","Type":"ContainerDied","Data":"df6d4a26700510b0783e300407e203c9c549b3a5f5ee209f489a32ee180e1b2e"} Apr 25 00:27:14.167517 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:14.167489 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-42458c-predictor-68f6f7f769-4kjd4_4c648323-9cf4-4ff4-ae5a-85687a1af0aa/storage-initializer/0.log" Apr 25 00:27:14.167949 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:14.167581 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" event={"ID":"4c648323-9cf4-4ff4-ae5a-85687a1af0aa","Type":"ContainerStarted","Data":"02791329d1c49544961753018314d882b692a81554ebe8b29911b46dc0c1303e"} Apr 25 00:27:17.176866 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:17.176791 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-42458c-predictor-68f6f7f769-4kjd4_4c648323-9cf4-4ff4-ae5a-85687a1af0aa/storage-initializer/1.log" Apr 25 
00:27:17.177302 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:17.177136 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-42458c-predictor-68f6f7f769-4kjd4_4c648323-9cf4-4ff4-ae5a-85687a1af0aa/storage-initializer/0.log" Apr 25 00:27:17.177302 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:17.177165 2576 generic.go:358] "Generic (PLEG): container finished" podID="4c648323-9cf4-4ff4-ae5a-85687a1af0aa" containerID="02791329d1c49544961753018314d882b692a81554ebe8b29911b46dc0c1303e" exitCode=1 Apr 25 00:27:17.177302 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:17.177234 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" event={"ID":"4c648323-9cf4-4ff4-ae5a-85687a1af0aa","Type":"ContainerDied","Data":"02791329d1c49544961753018314d882b692a81554ebe8b29911b46dc0c1303e"} Apr 25 00:27:17.177302 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:17.177265 2576 scope.go:117] "RemoveContainer" containerID="df6d4a26700510b0783e300407e203c9c549b3a5f5ee209f489a32ee180e1b2e" Apr 25 00:27:17.177665 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:17.177647 2576 scope.go:117] "RemoveContainer" containerID="df6d4a26700510b0783e300407e203c9c549b3a5f5ee209f489a32ee180e1b2e" Apr 25 00:27:17.187275 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:27:17.187238 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-42458c-predictor-68f6f7f769-4kjd4_kserve-ci-e2e-test_4c648323-9cf4-4ff4-ae5a-85687a1af0aa_0 in pod sandbox 90f1d04357dac0d66a54416da6687a99e32181d37b498660ad973ad681b2c601 from index: no such id: 'df6d4a26700510b0783e300407e203c9c549b3a5f5ee209f489a32ee180e1b2e'" containerID="df6d4a26700510b0783e300407e203c9c549b3a5f5ee209f489a32ee180e1b2e" Apr 25 00:27:17.187349 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:27:17.187300 2576 
kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-42458c-predictor-68f6f7f769-4kjd4_kserve-ci-e2e-test_4c648323-9cf4-4ff4-ae5a-85687a1af0aa_0 in pod sandbox 90f1d04357dac0d66a54416da6687a99e32181d37b498660ad973ad681b2c601 from index: no such id: 'df6d4a26700510b0783e300407e203c9c549b3a5f5ee209f489a32ee180e1b2e'; Skipping pod \"isvc-secondary-42458c-predictor-68f6f7f769-4kjd4_kserve-ci-e2e-test(4c648323-9cf4-4ff4-ae5a-85687a1af0aa)\"" logger="UnhandledError" Apr 25 00:27:17.188602 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:27:17.188580 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-42458c-predictor-68f6f7f769-4kjd4_kserve-ci-e2e-test(4c648323-9cf4-4ff4-ae5a-85687a1af0aa)\"" pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" podUID="4c648323-9cf4-4ff4-ae5a-85687a1af0aa" Apr 25 00:27:18.181538 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:18.181508 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-42458c-predictor-68f6f7f769-4kjd4_4c648323-9cf4-4ff4-ae5a-85687a1af0aa/storage-initializer/1.log" Apr 25 00:27:24.730010 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:24.729978 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4"] Apr 25 00:27:24.794329 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:24.794297 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7"] Apr 25 00:27:24.794778 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:24.794726 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" podUID="734f07a4-1148-403e-b849-b60ad7ec15f8" containerName="kserve-container" containerID="cri-o://4b0e205458f52ff410ac204b01a4f58cdaac1744a0bd6408e9685f3f086527db" gracePeriod=30 Apr 25 00:27:24.794856 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:24.794751 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" podUID="734f07a4-1148-403e-b849-b60ad7ec15f8" containerName="kube-rbac-proxy" containerID="cri-o://4fbe4235e5b8826dbf5484d671d8cbf79c43a48faf0a6bafb99199a45f82bac4" gracePeriod=30 Apr 25 00:27:24.865167 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:24.865137 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn"] Apr 25 00:27:24.868534 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:24.868515 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" Apr 25 00:27:24.871071 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:24.871045 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-ac23ba-predictor-serving-cert\"" Apr 25 00:27:24.871071 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:24.871054 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-ac23ba-dockercfg-bwbj6\"" Apr 25 00:27:24.871259 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:24.871054 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-ac23ba-kube-rbac-proxy-sar-config\"" Apr 25 00:27:24.871259 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:24.871054 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-ac23ba\"" Apr 25 00:27:24.877562 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:24.877538 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn"] Apr 25 00:27:24.913644 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:24.913622 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-42458c-predictor-68f6f7f769-4kjd4_4c648323-9cf4-4ff4-ae5a-85687a1af0aa/storage-initializer/1.log" Apr 25 00:27:24.913733 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:24.913688 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" Apr 25 00:27:24.935953 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:24.935898 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" podUID="734f07a4-1148-403e-b849-b60ad7ec15f8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.38:8643/healthz\": dial tcp 10.134.0.38:8643: connect: connection refused" Apr 25 00:27:25.032129 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.032099 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-cabundle-cert\") pod \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\" (UID: \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\") " Apr 25 00:27:25.032295 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.032166 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-kserve-provision-location\") pod \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\" (UID: \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\") " Apr 25 00:27:25.032295 
ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.032187 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-42458c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-isvc-secondary-42458c-kube-rbac-proxy-sar-config\") pod \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\" (UID: \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\") " Apr 25 00:27:25.032295 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.032214 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-proxy-tls\") pod \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\" (UID: \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\") " Apr 25 00:27:25.032295 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.032237 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xf79\" (UniqueName: \"kubernetes.io/projected/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-kube-api-access-7xf79\") pod \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\" (UID: \"4c648323-9cf4-4ff4-ae5a-85687a1af0aa\") " Apr 25 00:27:25.032501 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.032386 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-proxy-tls\") pod \"isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn\" (UID: \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\") " pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" Apr 25 00:27:25.032501 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.032451 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-kserve-provision-location\") pod \"isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn\" 
(UID: \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\") " pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" Apr 25 00:27:25.032606 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.032541 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4c648323-9cf4-4ff4-ae5a-85687a1af0aa" (UID: "4c648323-9cf4-4ff4-ae5a-85687a1af0aa"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:27:25.032606 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.032564 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "4c648323-9cf4-4ff4-ae5a-85687a1af0aa" (UID: "4c648323-9cf4-4ff4-ae5a-85687a1af0aa"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:27:25.032606 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.032575 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-isvc-secondary-42458c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-42458c-kube-rbac-proxy-sar-config") pod "4c648323-9cf4-4ff4-ae5a-85687a1af0aa" (UID: "4c648323-9cf4-4ff4-ae5a-85687a1af0aa"). InnerVolumeSpecName "isvc-secondary-42458c-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:27:25.032739 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.032649 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-ac23ba-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-isvc-init-fail-ac23ba-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn\" (UID: \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\") " pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" Apr 25 00:27:25.032739 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.032678 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7n4g\" (UniqueName: \"kubernetes.io/projected/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-kube-api-access-k7n4g\") pod \"isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn\" (UID: \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\") " pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" Apr 25 00:27:25.032739 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.032703 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-cabundle-cert\") pod \"isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn\" (UID: \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\") " pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" Apr 25 00:27:25.032857 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.032764 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-cabundle-cert\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:27:25.032857 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.032776 2576 reconciler_common.go:299] "Volume detached 
for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:27:25.032857 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.032787 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-42458c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-isvc-secondary-42458c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:27:25.034506 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.034485 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-kube-api-access-7xf79" (OuterVolumeSpecName: "kube-api-access-7xf79") pod "4c648323-9cf4-4ff4-ae5a-85687a1af0aa" (UID: "4c648323-9cf4-4ff4-ae5a-85687a1af0aa"). InnerVolumeSpecName "kube-api-access-7xf79". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:27:25.034602 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.034587 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4c648323-9cf4-4ff4-ae5a-85687a1af0aa" (UID: "4c648323-9cf4-4ff4-ae5a-85687a1af0aa"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:27:25.133297 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.133256 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-kserve-provision-location\") pod \"isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn\" (UID: \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\") " pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" Apr 25 00:27:25.133470 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.133327 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-ac23ba-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-isvc-init-fail-ac23ba-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn\" (UID: \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\") " pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" Apr 25 00:27:25.133470 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.133353 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7n4g\" (UniqueName: \"kubernetes.io/projected/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-kube-api-access-k7n4g\") pod \"isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn\" (UID: \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\") " pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" Apr 25 00:27:25.133470 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.133383 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-cabundle-cert\") pod \"isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn\" (UID: \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\") " pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" Apr 25 00:27:25.133470 
ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.133414 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-proxy-tls\") pod \"isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn\" (UID: \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\") " pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" Apr 25 00:27:25.133683 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.133471 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:27:25.133683 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.133486 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7xf79\" (UniqueName: \"kubernetes.io/projected/4c648323-9cf4-4ff4-ae5a-85687a1af0aa-kube-api-access-7xf79\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:27:25.133788 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.133691 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-kserve-provision-location\") pod \"isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn\" (UID: \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\") " pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" Apr 25 00:27:25.134018 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.133998 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-ac23ba-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-isvc-init-fail-ac23ba-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn\" (UID: \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\") " 
pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" Apr 25 00:27:25.134101 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.134007 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-cabundle-cert\") pod \"isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn\" (UID: \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\") " pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" Apr 25 00:27:25.136036 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.136016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-proxy-tls\") pod \"isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn\" (UID: \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\") " pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" Apr 25 00:27:25.141526 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.141508 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7n4g\" (UniqueName: \"kubernetes.io/projected/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-kube-api-access-k7n4g\") pod \"isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn\" (UID: \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\") " pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" Apr 25 00:27:25.179655 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.179625 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" Apr 25 00:27:25.207667 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.207634 2576 generic.go:358] "Generic (PLEG): container finished" podID="734f07a4-1148-403e-b849-b60ad7ec15f8" containerID="4fbe4235e5b8826dbf5484d671d8cbf79c43a48faf0a6bafb99199a45f82bac4" exitCode=2 Apr 25 00:27:25.207808 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.207707 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" event={"ID":"734f07a4-1148-403e-b849-b60ad7ec15f8","Type":"ContainerDied","Data":"4fbe4235e5b8826dbf5484d671d8cbf79c43a48faf0a6bafb99199a45f82bac4"} Apr 25 00:27:25.208806 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.208788 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-42458c-predictor-68f6f7f769-4kjd4_4c648323-9cf4-4ff4-ae5a-85687a1af0aa/storage-initializer/1.log" Apr 25 00:27:25.208939 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.208844 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" event={"ID":"4c648323-9cf4-4ff4-ae5a-85687a1af0aa","Type":"ContainerDied","Data":"90f1d04357dac0d66a54416da6687a99e32181d37b498660ad973ad681b2c601"} Apr 25 00:27:25.208939 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.208870 2576 scope.go:117] "RemoveContainer" containerID="02791329d1c49544961753018314d882b692a81554ebe8b29911b46dc0c1303e" Apr 25 00:27:25.209060 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.208941 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4" Apr 25 00:27:25.245168 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.245115 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4"] Apr 25 00:27:25.250111 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.250081 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-42458c-predictor-68f6f7f769-4kjd4"] Apr 25 00:27:25.303618 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:25.303594 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn"] Apr 25 00:27:25.305968 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:27:25.305946 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0803c672_7bd5_4bca_a9ec_b02e0055c3f2.slice/crio-cd29e7dd59b742cfc97b0ca9ee070554b68167773300d00928d3c1ed309cd80b WatchSource:0}: Error finding container cd29e7dd59b742cfc97b0ca9ee070554b68167773300d00928d3c1ed309cd80b: Status 404 returned error can't find the container with id cd29e7dd59b742cfc97b0ca9ee070554b68167773300d00928d3c1ed309cd80b Apr 25 00:27:26.212653 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:26.212610 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" event={"ID":"0803c672-7bd5-4bca-a9ec-b02e0055c3f2","Type":"ContainerStarted","Data":"f372e2454b9b2565b90dec028faa94ea7192cc4b89c7e106ebc2730c6f3a0395"} Apr 25 00:27:26.212653 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:26.212651 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" 
event={"ID":"0803c672-7bd5-4bca-a9ec-b02e0055c3f2","Type":"ContainerStarted","Data":"cd29e7dd59b742cfc97b0ca9ee070554b68167773300d00928d3c1ed309cd80b"} Apr 25 00:27:26.320585 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:26.320541 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c648323-9cf4-4ff4-ae5a-85687a1af0aa" path="/var/lib/kubelet/pods/4c648323-9cf4-4ff4-ae5a-85687a1af0aa/volumes" Apr 25 00:27:28.931121 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:28.931098 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" Apr 25 00:27:29.062946 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.062821 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/734f07a4-1148-403e-b849-b60ad7ec15f8-proxy-tls\") pod \"734f07a4-1148-403e-b849-b60ad7ec15f8\" (UID: \"734f07a4-1148-403e-b849-b60ad7ec15f8\") " Apr 25 00:27:29.062946 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.062886 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/734f07a4-1148-403e-b849-b60ad7ec15f8-kserve-provision-location\") pod \"734f07a4-1148-403e-b849-b60ad7ec15f8\" (UID: \"734f07a4-1148-403e-b849-b60ad7ec15f8\") " Apr 25 00:27:29.063183 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.062945 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-42458c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/734f07a4-1148-403e-b849-b60ad7ec15f8-isvc-primary-42458c-kube-rbac-proxy-sar-config\") pod \"734f07a4-1148-403e-b849-b60ad7ec15f8\" (UID: \"734f07a4-1148-403e-b849-b60ad7ec15f8\") " Apr 25 00:27:29.063183 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.062990 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-mdp4v\" (UniqueName: \"kubernetes.io/projected/734f07a4-1148-403e-b849-b60ad7ec15f8-kube-api-access-mdp4v\") pod \"734f07a4-1148-403e-b849-b60ad7ec15f8\" (UID: \"734f07a4-1148-403e-b849-b60ad7ec15f8\") " Apr 25 00:27:29.063292 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.063263 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/734f07a4-1148-403e-b849-b60ad7ec15f8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "734f07a4-1148-403e-b849-b60ad7ec15f8" (UID: "734f07a4-1148-403e-b849-b60ad7ec15f8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:27:29.063327 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.063289 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/734f07a4-1148-403e-b849-b60ad7ec15f8-isvc-primary-42458c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-42458c-kube-rbac-proxy-sar-config") pod "734f07a4-1148-403e-b849-b60ad7ec15f8" (UID: "734f07a4-1148-403e-b849-b60ad7ec15f8"). InnerVolumeSpecName "isvc-primary-42458c-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:27:29.065249 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.065215 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/734f07a4-1148-403e-b849-b60ad7ec15f8-kube-api-access-mdp4v" (OuterVolumeSpecName: "kube-api-access-mdp4v") pod "734f07a4-1148-403e-b849-b60ad7ec15f8" (UID: "734f07a4-1148-403e-b849-b60ad7ec15f8"). InnerVolumeSpecName "kube-api-access-mdp4v". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:27:29.065336 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.065270 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/734f07a4-1148-403e-b849-b60ad7ec15f8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "734f07a4-1148-403e-b849-b60ad7ec15f8" (UID: "734f07a4-1148-403e-b849-b60ad7ec15f8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:27:29.164155 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.164111 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/734f07a4-1148-403e-b849-b60ad7ec15f8-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:27:29.164155 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.164148 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/734f07a4-1148-403e-b849-b60ad7ec15f8-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:27:29.164155 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.164163 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-42458c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/734f07a4-1148-403e-b849-b60ad7ec15f8-isvc-primary-42458c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:27:29.164396 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.164178 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mdp4v\" (UniqueName: \"kubernetes.io/projected/734f07a4-1148-403e-b849-b60ad7ec15f8-kube-api-access-mdp4v\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:27:29.223504 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.223480 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn_0803c672-7bd5-4bca-a9ec-b02e0055c3f2/storage-initializer/0.log" Apr 25 00:27:29.223606 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.223521 2576 generic.go:358] "Generic (PLEG): container finished" podID="0803c672-7bd5-4bca-a9ec-b02e0055c3f2" containerID="f372e2454b9b2565b90dec028faa94ea7192cc4b89c7e106ebc2730c6f3a0395" exitCode=1 Apr 25 00:27:29.223653 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.223604 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" event={"ID":"0803c672-7bd5-4bca-a9ec-b02e0055c3f2","Type":"ContainerDied","Data":"f372e2454b9b2565b90dec028faa94ea7192cc4b89c7e106ebc2730c6f3a0395"} Apr 25 00:27:29.225185 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.225150 2576 generic.go:358] "Generic (PLEG): container finished" podID="734f07a4-1148-403e-b849-b60ad7ec15f8" containerID="4b0e205458f52ff410ac204b01a4f58cdaac1744a0bd6408e9685f3f086527db" exitCode=0 Apr 25 00:27:29.225302 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.225238 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" Apr 25 00:27:29.225407 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.225240 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" event={"ID":"734f07a4-1148-403e-b849-b60ad7ec15f8","Type":"ContainerDied","Data":"4b0e205458f52ff410ac204b01a4f58cdaac1744a0bd6408e9685f3f086527db"} Apr 25 00:27:29.225407 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.225354 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7" event={"ID":"734f07a4-1148-403e-b849-b60ad7ec15f8","Type":"ContainerDied","Data":"2e9d33dc1b72ade0b475fcc2599b981ae723da4e1e5c78a3ffa38205fe245bca"} Apr 25 00:27:29.225407 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.225384 2576 scope.go:117] "RemoveContainer" containerID="4fbe4235e5b8826dbf5484d671d8cbf79c43a48faf0a6bafb99199a45f82bac4" Apr 25 00:27:29.253473 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.253455 2576 scope.go:117] "RemoveContainer" containerID="4b0e205458f52ff410ac204b01a4f58cdaac1744a0bd6408e9685f3f086527db" Apr 25 00:27:29.272573 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.272532 2576 scope.go:117] "RemoveContainer" containerID="60137d23113642574c88e825006dd1969b677b31ea0485c247a79ef39cd25931" Apr 25 00:27:29.285952 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.285896 2576 scope.go:117] "RemoveContainer" containerID="4fbe4235e5b8826dbf5484d671d8cbf79c43a48faf0a6bafb99199a45f82bac4" Apr 25 00:27:29.286281 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:27:29.286260 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fbe4235e5b8826dbf5484d671d8cbf79c43a48faf0a6bafb99199a45f82bac4\": container with ID starting with 4fbe4235e5b8826dbf5484d671d8cbf79c43a48faf0a6bafb99199a45f82bac4 not found: ID does not 
exist" containerID="4fbe4235e5b8826dbf5484d671d8cbf79c43a48faf0a6bafb99199a45f82bac4" Apr 25 00:27:29.286366 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.286294 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fbe4235e5b8826dbf5484d671d8cbf79c43a48faf0a6bafb99199a45f82bac4"} err="failed to get container status \"4fbe4235e5b8826dbf5484d671d8cbf79c43a48faf0a6bafb99199a45f82bac4\": rpc error: code = NotFound desc = could not find container \"4fbe4235e5b8826dbf5484d671d8cbf79c43a48faf0a6bafb99199a45f82bac4\": container with ID starting with 4fbe4235e5b8826dbf5484d671d8cbf79c43a48faf0a6bafb99199a45f82bac4 not found: ID does not exist" Apr 25 00:27:29.286366 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.286320 2576 scope.go:117] "RemoveContainer" containerID="4b0e205458f52ff410ac204b01a4f58cdaac1744a0bd6408e9685f3f086527db" Apr 25 00:27:29.286579 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.286545 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7"] Apr 25 00:27:29.286644 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:27:29.286588 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b0e205458f52ff410ac204b01a4f58cdaac1744a0bd6408e9685f3f086527db\": container with ID starting with 4b0e205458f52ff410ac204b01a4f58cdaac1744a0bd6408e9685f3f086527db not found: ID does not exist" containerID="4b0e205458f52ff410ac204b01a4f58cdaac1744a0bd6408e9685f3f086527db" Apr 25 00:27:29.286644 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.286612 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b0e205458f52ff410ac204b01a4f58cdaac1744a0bd6408e9685f3f086527db"} err="failed to get container status \"4b0e205458f52ff410ac204b01a4f58cdaac1744a0bd6408e9685f3f086527db\": rpc error: code = NotFound desc = could not find 
container \"4b0e205458f52ff410ac204b01a4f58cdaac1744a0bd6408e9685f3f086527db\": container with ID starting with 4b0e205458f52ff410ac204b01a4f58cdaac1744a0bd6408e9685f3f086527db not found: ID does not exist" Apr 25 00:27:29.286644 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.286634 2576 scope.go:117] "RemoveContainer" containerID="60137d23113642574c88e825006dd1969b677b31ea0485c247a79ef39cd25931" Apr 25 00:27:29.286943 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:27:29.286899 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60137d23113642574c88e825006dd1969b677b31ea0485c247a79ef39cd25931\": container with ID starting with 60137d23113642574c88e825006dd1969b677b31ea0485c247a79ef39cd25931 not found: ID does not exist" containerID="60137d23113642574c88e825006dd1969b677b31ea0485c247a79ef39cd25931" Apr 25 00:27:29.286997 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.286953 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60137d23113642574c88e825006dd1969b677b31ea0485c247a79ef39cd25931"} err="failed to get container status \"60137d23113642574c88e825006dd1969b677b31ea0485c247a79ef39cd25931\": rpc error: code = NotFound desc = could not find container \"60137d23113642574c88e825006dd1969b677b31ea0485c247a79ef39cd25931\": container with ID starting with 60137d23113642574c88e825006dd1969b677b31ea0485c247a79ef39cd25931 not found: ID does not exist" Apr 25 00:27:29.291834 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.291812 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-42458c-predictor-c86f7785d-rxtt7"] Apr 25 00:27:29.855045 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.855009 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn"] Apr 25 00:27:29.977826 ip-10-0-129-109 kubenswrapper[2576]: I0425 
00:27:29.977784 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc"] Apr 25 00:27:29.978354 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.978227 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="734f07a4-1148-403e-b849-b60ad7ec15f8" containerName="kserve-container" Apr 25 00:27:29.978354 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.978246 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="734f07a4-1148-403e-b849-b60ad7ec15f8" containerName="kserve-container" Apr 25 00:27:29.978354 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.978265 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="734f07a4-1148-403e-b849-b60ad7ec15f8" containerName="storage-initializer" Apr 25 00:27:29.978354 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.978273 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="734f07a4-1148-403e-b849-b60ad7ec15f8" containerName="storage-initializer" Apr 25 00:27:29.978354 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.978280 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="734f07a4-1148-403e-b849-b60ad7ec15f8" containerName="kube-rbac-proxy" Apr 25 00:27:29.978354 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.978288 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="734f07a4-1148-403e-b849-b60ad7ec15f8" containerName="kube-rbac-proxy" Apr 25 00:27:29.978354 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.978302 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c648323-9cf4-4ff4-ae5a-85687a1af0aa" containerName="storage-initializer" Apr 25 00:27:29.978354 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.978309 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c648323-9cf4-4ff4-ae5a-85687a1af0aa" containerName="storage-initializer" Apr 25 00:27:29.978759 ip-10-0-129-109 
kubenswrapper[2576]: I0425 00:27:29.978385 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c648323-9cf4-4ff4-ae5a-85687a1af0aa" containerName="storage-initializer" Apr 25 00:27:29.978759 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.978397 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="734f07a4-1148-403e-b849-b60ad7ec15f8" containerName="kube-rbac-proxy" Apr 25 00:27:29.978759 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.978408 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="734f07a4-1148-403e-b849-b60ad7ec15f8" containerName="kserve-container" Apr 25 00:27:29.978759 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.978476 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c648323-9cf4-4ff4-ae5a-85687a1af0aa" containerName="storage-initializer" Apr 25 00:27:29.978759 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.978486 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c648323-9cf4-4ff4-ae5a-85687a1af0aa" containerName="storage-initializer" Apr 25 00:27:29.978759 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.978580 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c648323-9cf4-4ff4-ae5a-85687a1af0aa" containerName="storage-initializer" Apr 25 00:27:29.981699 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.981677 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" Apr 25 00:27:29.985393 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.985368 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-predictor-serving-cert\"" Apr 25 00:27:29.985658 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.985630 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-kz9zk\"" Apr 25 00:27:29.986174 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.986158 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\"" Apr 25 00:27:29.994561 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:29.994539 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc"] Apr 25 00:27:30.070597 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:30.070560 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/46ae2373-92e9-4f0b-9ed5-6607ea44000b-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc\" (UID: \"46ae2373-92e9-4f0b-9ed5-6607ea44000b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" Apr 25 00:27:30.070766 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:30.070625 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4wmn\" (UniqueName: \"kubernetes.io/projected/46ae2373-92e9-4f0b-9ed5-6607ea44000b-kube-api-access-x4wmn\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc\" (UID: \"46ae2373-92e9-4f0b-9ed5-6607ea44000b\") " 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" Apr 25 00:27:30.070766 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:30.070704 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/46ae2373-92e9-4f0b-9ed5-6607ea44000b-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc\" (UID: \"46ae2373-92e9-4f0b-9ed5-6607ea44000b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" Apr 25 00:27:30.070766 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:30.070742 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/46ae2373-92e9-4f0b-9ed5-6607ea44000b-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc\" (UID: \"46ae2373-92e9-4f0b-9ed5-6607ea44000b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" Apr 25 00:27:30.172144 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:30.172027 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x4wmn\" (UniqueName: \"kubernetes.io/projected/46ae2373-92e9-4f0b-9ed5-6607ea44000b-kube-api-access-x4wmn\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc\" (UID: \"46ae2373-92e9-4f0b-9ed5-6607ea44000b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" Apr 25 00:27:30.172144 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:30.172111 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/46ae2373-92e9-4f0b-9ed5-6607ea44000b-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc\" (UID: \"46ae2373-92e9-4f0b-9ed5-6607ea44000b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" Apr 
25 00:27:30.172144 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:30.172145 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/46ae2373-92e9-4f0b-9ed5-6607ea44000b-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc\" (UID: \"46ae2373-92e9-4f0b-9ed5-6607ea44000b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" Apr 25 00:27:30.172425 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:30.172182 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/46ae2373-92e9-4f0b-9ed5-6607ea44000b-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc\" (UID: \"46ae2373-92e9-4f0b-9ed5-6607ea44000b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" Apr 25 00:27:30.172425 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:27:30.172270 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-serving-cert: secret "isvc-predictive-sklearn-predictor-serving-cert" not found Apr 25 00:27:30.172425 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:27:30.172331 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46ae2373-92e9-4f0b-9ed5-6607ea44000b-proxy-tls podName:46ae2373-92e9-4f0b-9ed5-6607ea44000b nodeName:}" failed. No retries permitted until 2026-04-25 00:27:30.672314641 +0000 UTC m=+2010.953431539 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/46ae2373-92e9-4f0b-9ed5-6607ea44000b-proxy-tls") pod "isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" (UID: "46ae2373-92e9-4f0b-9ed5-6607ea44000b") : secret "isvc-predictive-sklearn-predictor-serving-cert" not found Apr 25 00:27:30.172584 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:30.172482 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/46ae2373-92e9-4f0b-9ed5-6607ea44000b-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc\" (UID: \"46ae2373-92e9-4f0b-9ed5-6607ea44000b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" Apr 25 00:27:30.172849 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:30.172821 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/46ae2373-92e9-4f0b-9ed5-6607ea44000b-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc\" (UID: \"46ae2373-92e9-4f0b-9ed5-6607ea44000b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" Apr 25 00:27:30.181190 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:30.181159 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4wmn\" (UniqueName: \"kubernetes.io/projected/46ae2373-92e9-4f0b-9ed5-6607ea44000b-kube-api-access-x4wmn\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc\" (UID: \"46ae2373-92e9-4f0b-9ed5-6607ea44000b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" Apr 25 00:27:30.230811 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:30.230785 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn_0803c672-7bd5-4bca-a9ec-b02e0055c3f2/storage-initializer/0.log" Apr 25 00:27:30.231012 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:30.230890 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" event={"ID":"0803c672-7bd5-4bca-a9ec-b02e0055c3f2","Type":"ContainerStarted","Data":"d474782e05862b09194c18ca26650c911597f2d193c955ab3c1d1a88a3b8f111"} Apr 25 00:27:30.231080 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:30.231030 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" podUID="0803c672-7bd5-4bca-a9ec-b02e0055c3f2" containerName="storage-initializer" containerID="cri-o://d474782e05862b09194c18ca26650c911597f2d193c955ab3c1d1a88a3b8f111" gracePeriod=30 Apr 25 00:27:30.316549 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:30.316517 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="734f07a4-1148-403e-b849-b60ad7ec15f8" path="/var/lib/kubelet/pods/734f07a4-1148-403e-b849-b60ad7ec15f8/volumes" Apr 25 00:27:30.678080 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:30.678045 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/46ae2373-92e9-4f0b-9ed5-6607ea44000b-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc\" (UID: \"46ae2373-92e9-4f0b-9ed5-6607ea44000b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" Apr 25 00:27:30.683361 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:30.683329 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/46ae2373-92e9-4f0b-9ed5-6607ea44000b-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc\" (UID: 
\"46ae2373-92e9-4f0b-9ed5-6607ea44000b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" Apr 25 00:27:30.892282 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:30.892239 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" Apr 25 00:27:31.010247 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:31.010210 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc"] Apr 25 00:27:31.013354 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:27:31.013329 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46ae2373_92e9_4f0b_9ed5_6607ea44000b.slice/crio-68edda5c8beb456f4530d47d0668c97756f34444ce057a81b550a1f44b9cb3ac WatchSource:0}: Error finding container 68edda5c8beb456f4530d47d0668c97756f34444ce057a81b550a1f44b9cb3ac: Status 404 returned error can't find the container with id 68edda5c8beb456f4530d47d0668c97756f34444ce057a81b550a1f44b9cb3ac Apr 25 00:27:31.236054 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:31.235974 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" event={"ID":"46ae2373-92e9-4f0b-9ed5-6607ea44000b","Type":"ContainerStarted","Data":"32e8adb25be9d447374c5ca268bf12be5501cf69b6a4389b7a6c03a6d68a6dbf"} Apr 25 00:27:31.236054 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:31.236007 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" event={"ID":"46ae2373-92e9-4f0b-9ed5-6607ea44000b","Type":"ContainerStarted","Data":"68edda5c8beb456f4530d47d0668c97756f34444ce057a81b550a1f44b9cb3ac"} Apr 25 00:27:31.667113 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:31.667091 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn_0803c672-7bd5-4bca-a9ec-b02e0055c3f2/storage-initializer/1.log" Apr 25 00:27:31.667429 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:31.667411 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn_0803c672-7bd5-4bca-a9ec-b02e0055c3f2/storage-initializer/0.log" Apr 25 00:27:31.667506 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:31.667494 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" Apr 25 00:27:31.789092 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:31.789059 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7n4g\" (UniqueName: \"kubernetes.io/projected/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-kube-api-access-k7n4g\") pod \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\" (UID: \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\") " Apr 25 00:27:31.789281 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:31.789119 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-proxy-tls\") pod \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\" (UID: \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\") " Apr 25 00:27:31.789281 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:31.789152 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-kserve-provision-location\") pod \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\" (UID: \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\") " Apr 25 00:27:31.789281 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:31.789205 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"isvc-init-fail-ac23ba-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-isvc-init-fail-ac23ba-kube-rbac-proxy-sar-config\") pod \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\" (UID: \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\") " Apr 25 00:27:31.789281 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:31.789235 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-cabundle-cert\") pod \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\" (UID: \"0803c672-7bd5-4bca-a9ec-b02e0055c3f2\") " Apr 25 00:27:31.789504 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:31.789483 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0803c672-7bd5-4bca-a9ec-b02e0055c3f2" (UID: "0803c672-7bd5-4bca-a9ec-b02e0055c3f2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:27:31.789612 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:31.789585 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-isvc-init-fail-ac23ba-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-ac23ba-kube-rbac-proxy-sar-config") pod "0803c672-7bd5-4bca-a9ec-b02e0055c3f2" (UID: "0803c672-7bd5-4bca-a9ec-b02e0055c3f2"). InnerVolumeSpecName "isvc-init-fail-ac23ba-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:27:31.789692 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:31.789621 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "0803c672-7bd5-4bca-a9ec-b02e0055c3f2" (UID: "0803c672-7bd5-4bca-a9ec-b02e0055c3f2"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:27:31.791367 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:31.791345 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-kube-api-access-k7n4g" (OuterVolumeSpecName: "kube-api-access-k7n4g") pod "0803c672-7bd5-4bca-a9ec-b02e0055c3f2" (UID: "0803c672-7bd5-4bca-a9ec-b02e0055c3f2"). InnerVolumeSpecName "kube-api-access-k7n4g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:27:31.791456 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:31.791402 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0803c672-7bd5-4bca-a9ec-b02e0055c3f2" (UID: "0803c672-7bd5-4bca-a9ec-b02e0055c3f2"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:27:31.889954 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:31.889875 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-cabundle-cert\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:27:31.889954 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:31.889895 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k7n4g\" (UniqueName: \"kubernetes.io/projected/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-kube-api-access-k7n4g\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:27:31.889954 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:31.889906 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:27:31.889954 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:31.889931 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:27:31.889954 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:31.889941 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-ac23ba-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0803c672-7bd5-4bca-a9ec-b02e0055c3f2-isvc-init-fail-ac23ba-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:27:32.240162 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:32.240077 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn_0803c672-7bd5-4bca-a9ec-b02e0055c3f2/storage-initializer/1.log" Apr 25 
00:27:32.240527 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:32.240437 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn_0803c672-7bd5-4bca-a9ec-b02e0055c3f2/storage-initializer/0.log" Apr 25 00:27:32.240527 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:32.240474 2576 generic.go:358] "Generic (PLEG): container finished" podID="0803c672-7bd5-4bca-a9ec-b02e0055c3f2" containerID="d474782e05862b09194c18ca26650c911597f2d193c955ab3c1d1a88a3b8f111" exitCode=1 Apr 25 00:27:32.240527 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:32.240507 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" event={"ID":"0803c672-7bd5-4bca-a9ec-b02e0055c3f2","Type":"ContainerDied","Data":"d474782e05862b09194c18ca26650c911597f2d193c955ab3c1d1a88a3b8f111"} Apr 25 00:27:32.240697 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:32.240556 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" event={"ID":"0803c672-7bd5-4bca-a9ec-b02e0055c3f2","Type":"ContainerDied","Data":"cd29e7dd59b742cfc97b0ca9ee070554b68167773300d00928d3c1ed309cd80b"} Apr 25 00:27:32.240697 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:32.240565 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn" Apr 25 00:27:32.240697 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:32.240578 2576 scope.go:117] "RemoveContainer" containerID="d474782e05862b09194c18ca26650c911597f2d193c955ab3c1d1a88a3b8f111" Apr 25 00:27:32.249628 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:32.249441 2576 scope.go:117] "RemoveContainer" containerID="f372e2454b9b2565b90dec028faa94ea7192cc4b89c7e106ebc2730c6f3a0395" Apr 25 00:27:32.256310 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:32.256289 2576 scope.go:117] "RemoveContainer" containerID="d474782e05862b09194c18ca26650c911597f2d193c955ab3c1d1a88a3b8f111" Apr 25 00:27:32.256574 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:27:32.256555 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d474782e05862b09194c18ca26650c911597f2d193c955ab3c1d1a88a3b8f111\": container with ID starting with d474782e05862b09194c18ca26650c911597f2d193c955ab3c1d1a88a3b8f111 not found: ID does not exist" containerID="d474782e05862b09194c18ca26650c911597f2d193c955ab3c1d1a88a3b8f111" Apr 25 00:27:32.256629 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:32.256583 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d474782e05862b09194c18ca26650c911597f2d193c955ab3c1d1a88a3b8f111"} err="failed to get container status \"d474782e05862b09194c18ca26650c911597f2d193c955ab3c1d1a88a3b8f111\": rpc error: code = NotFound desc = could not find container \"d474782e05862b09194c18ca26650c911597f2d193c955ab3c1d1a88a3b8f111\": container with ID starting with d474782e05862b09194c18ca26650c911597f2d193c955ab3c1d1a88a3b8f111 not found: ID does not exist" Apr 25 00:27:32.256629 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:32.256602 2576 scope.go:117] "RemoveContainer" containerID="f372e2454b9b2565b90dec028faa94ea7192cc4b89c7e106ebc2730c6f3a0395" Apr 25 
00:27:32.256840 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:27:32.256826 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f372e2454b9b2565b90dec028faa94ea7192cc4b89c7e106ebc2730c6f3a0395\": container with ID starting with f372e2454b9b2565b90dec028faa94ea7192cc4b89c7e106ebc2730c6f3a0395 not found: ID does not exist" containerID="f372e2454b9b2565b90dec028faa94ea7192cc4b89c7e106ebc2730c6f3a0395" Apr 25 00:27:32.256889 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:32.256843 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f372e2454b9b2565b90dec028faa94ea7192cc4b89c7e106ebc2730c6f3a0395"} err="failed to get container status \"f372e2454b9b2565b90dec028faa94ea7192cc4b89c7e106ebc2730c6f3a0395\": rpc error: code = NotFound desc = could not find container \"f372e2454b9b2565b90dec028faa94ea7192cc4b89c7e106ebc2730c6f3a0395\": container with ID starting with f372e2454b9b2565b90dec028faa94ea7192cc4b89c7e106ebc2730c6f3a0395 not found: ID does not exist" Apr 25 00:27:32.276689 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:32.276662 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn"] Apr 25 00:27:32.280101 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:32.280075 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-ac23ba-predictor-89ccf4585-gmffn"] Apr 25 00:27:32.315963 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:32.315935 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0803c672-7bd5-4bca-a9ec-b02e0055c3f2" path="/var/lib/kubelet/pods/0803c672-7bd5-4bca-a9ec-b02e0055c3f2/volumes" Apr 25 00:27:35.251818 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:35.251721 2576 generic.go:358] "Generic (PLEG): container finished" podID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" 
containerID="32e8adb25be9d447374c5ca268bf12be5501cf69b6a4389b7a6c03a6d68a6dbf" exitCode=0 Apr 25 00:27:35.251818 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:35.251796 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" event={"ID":"46ae2373-92e9-4f0b-9ed5-6607ea44000b","Type":"ContainerDied","Data":"32e8adb25be9d447374c5ca268bf12be5501cf69b6a4389b7a6c03a6d68a6dbf"} Apr 25 00:27:55.314085 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:55.314046 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" event={"ID":"46ae2373-92e9-4f0b-9ed5-6607ea44000b","Type":"ContainerStarted","Data":"2e918a2ac0788686121104c9bfe6a9e2596e42cc0338839832787c5d81d6af89"} Apr 25 00:27:55.314085 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:55.314088 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" event={"ID":"46ae2373-92e9-4f0b-9ed5-6607ea44000b","Type":"ContainerStarted","Data":"803e5f3d64c291e86b2323e6c85b0a1c976234c28a7dc5bbca133c73e8edb39e"} Apr 25 00:27:55.314622 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:55.314297 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" Apr 25 00:27:55.332180 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:55.332132 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" podStartSLOduration=7.209317268 podStartE2EDuration="26.33212041s" podCreationTimestamp="2026-04-25 00:27:29 +0000 UTC" firstStartedPulling="2026-04-25 00:27:35.253091189 +0000 UTC m=+2015.534208087" lastFinishedPulling="2026-04-25 00:27:54.375894331 +0000 UTC m=+2034.657011229" observedRunningTime="2026-04-25 00:27:55.33096959 +0000 UTC 
m=+2035.612086507" watchObservedRunningTime="2026-04-25 00:27:55.33212041 +0000 UTC m=+2035.613237329" Apr 25 00:27:56.316461 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:56.316428 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" Apr 25 00:27:56.317551 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:56.317521 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" podUID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 25 00:27:57.319555 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:27:57.319512 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" podUID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 25 00:28:02.323478 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:28:02.323451 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" Apr 25 00:28:02.323969 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:28:02.323886 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" podUID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 25 00:28:12.324262 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:28:12.324226 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" podUID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 25 00:28:22.324669 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:28:22.324633 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" podUID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 25 00:28:32.324001 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:28:32.323965 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" podUID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 25 00:28:42.324168 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:28:42.324133 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" podUID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 25 00:28:52.324245 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:28:52.324208 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" podUID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 25 00:29:02.324637 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:02.324593 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" podUID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 25 
00:29:11.312658 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:11.312632 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" Apr 25 00:29:20.118190 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.118150 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc"] Apr 25 00:29:20.118751 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.118698 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" podUID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" containerName="kserve-container" containerID="cri-o://803e5f3d64c291e86b2323e6c85b0a1c976234c28a7dc5bbca133c73e8edb39e" gracePeriod=30 Apr 25 00:29:20.119344 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.118897 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" podUID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" containerName="kube-rbac-proxy" containerID="cri-o://2e918a2ac0788686121104c9bfe6a9e2596e42cc0338839832787c5d81d6af89" gracePeriod=30 Apr 25 00:29:20.226018 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.225980 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4"] Apr 25 00:29:20.226357 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.226340 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0803c672-7bd5-4bca-a9ec-b02e0055c3f2" containerName="storage-initializer" Apr 25 00:29:20.226448 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.226361 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0803c672-7bd5-4bca-a9ec-b02e0055c3f2" containerName="storage-initializer" Apr 25 00:29:20.226448 ip-10-0-129-109 kubenswrapper[2576]: I0425 
00:29:20.226382 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0803c672-7bd5-4bca-a9ec-b02e0055c3f2" containerName="storage-initializer" Apr 25 00:29:20.226448 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.226390 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0803c672-7bd5-4bca-a9ec-b02e0055c3f2" containerName="storage-initializer" Apr 25 00:29:20.226604 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.226459 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0803c672-7bd5-4bca-a9ec-b02e0055c3f2" containerName="storage-initializer" Apr 25 00:29:20.226604 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.226594 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0803c672-7bd5-4bca-a9ec-b02e0055c3f2" containerName="storage-initializer" Apr 25 00:29:20.229565 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.229542 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" Apr 25 00:29:20.232583 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.232561 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-predictor-serving-cert\"" Apr 25 00:29:20.232697 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.232642 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\"" Apr 25 00:29:20.242641 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.242618 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4"] Apr 25 00:29:20.299228 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.299201 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/edddf8ee-a85a-4393-9230-fe4ac63aeda2-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4\" (UID: \"edddf8ee-a85a-4393-9230-fe4ac63aeda2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" Apr 25 00:29:20.299353 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.299244 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn2l7\" (UniqueName: \"kubernetes.io/projected/edddf8ee-a85a-4393-9230-fe4ac63aeda2-kube-api-access-xn2l7\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4\" (UID: \"edddf8ee-a85a-4393-9230-fe4ac63aeda2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" Apr 25 00:29:20.299353 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.299284 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/edddf8ee-a85a-4393-9230-fe4ac63aeda2-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4\" (UID: \"edddf8ee-a85a-4393-9230-fe4ac63aeda2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" Apr 25 00:29:20.299442 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.299361 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edddf8ee-a85a-4393-9230-fe4ac63aeda2-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4\" (UID: \"edddf8ee-a85a-4393-9230-fe4ac63aeda2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" Apr 25 00:29:20.400538 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.400464 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/edddf8ee-a85a-4393-9230-fe4ac63aeda2-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4\" (UID: \"edddf8ee-a85a-4393-9230-fe4ac63aeda2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" Apr 25 00:29:20.400538 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.400504 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xn2l7\" (UniqueName: \"kubernetes.io/projected/edddf8ee-a85a-4393-9230-fe4ac63aeda2-kube-api-access-xn2l7\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4\" (UID: \"edddf8ee-a85a-4393-9230-fe4ac63aeda2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" Apr 25 00:29:20.400709 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.400629 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/edddf8ee-a85a-4393-9230-fe4ac63aeda2-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4\" (UID: \"edddf8ee-a85a-4393-9230-fe4ac63aeda2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" Apr 25 00:29:20.400709 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.400692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edddf8ee-a85a-4393-9230-fe4ac63aeda2-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4\" (UID: \"edddf8ee-a85a-4393-9230-fe4ac63aeda2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" Apr 25 00:29:20.400884 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.400861 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/edddf8ee-a85a-4393-9230-fe4ac63aeda2-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4\" (UID: \"edddf8ee-a85a-4393-9230-fe4ac63aeda2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" Apr 25 00:29:20.401266 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.401247 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/edddf8ee-a85a-4393-9230-fe4ac63aeda2-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4\" (UID: \"edddf8ee-a85a-4393-9230-fe4ac63aeda2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" Apr 25 00:29:20.403275 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.403251 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edddf8ee-a85a-4393-9230-fe4ac63aeda2-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4\" (UID: \"edddf8ee-a85a-4393-9230-fe4ac63aeda2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" Apr 25 00:29:20.408554 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.408532 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn2l7\" (UniqueName: \"kubernetes.io/projected/edddf8ee-a85a-4393-9230-fe4ac63aeda2-kube-api-access-xn2l7\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4\" (UID: \"edddf8ee-a85a-4393-9230-fe4ac63aeda2\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" Apr 25 00:29:20.540607 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.540557 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" Apr 25 00:29:20.554153 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.554125 2576 generic.go:358] "Generic (PLEG): container finished" podID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" containerID="2e918a2ac0788686121104c9bfe6a9e2596e42cc0338839832787c5d81d6af89" exitCode=2 Apr 25 00:29:20.554280 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.554201 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" event={"ID":"46ae2373-92e9-4f0b-9ed5-6607ea44000b","Type":"ContainerDied","Data":"2e918a2ac0788686121104c9bfe6a9e2596e42cc0338839832787c5d81d6af89"} Apr 25 00:29:20.659581 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:20.659547 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4"] Apr 25 00:29:20.661680 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:29:20.661654 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedddf8ee_a85a_4393_9230_fe4ac63aeda2.slice/crio-6e2d2b9917a52db34aae11f52091813e624946880a7f967f9db0cec346483146 WatchSource:0}: Error finding container 6e2d2b9917a52db34aae11f52091813e624946880a7f967f9db0cec346483146: Status 404 returned error can't find the container with id 6e2d2b9917a52db34aae11f52091813e624946880a7f967f9db0cec346483146 Apr 25 00:29:21.312117 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:21.312065 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" podUID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 25 00:29:21.558067 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:21.558020 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" event={"ID":"edddf8ee-a85a-4393-9230-fe4ac63aeda2","Type":"ContainerStarted","Data":"48ba0dc6c827ab6341437abd479ea01135a141fa9469df8b1d895cdeeafaf6b8"} Apr 25 00:29:21.558067 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:21.558070 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" event={"ID":"edddf8ee-a85a-4393-9230-fe4ac63aeda2","Type":"ContainerStarted","Data":"6e2d2b9917a52db34aae11f52091813e624946880a7f967f9db0cec346483146"} Apr 25 00:29:22.320253 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:22.320219 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" podUID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.41:8643/healthz\": dial tcp 10.134.0.41:8643: connect: connection refused" Apr 25 00:29:24.562290 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.562267 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" Apr 25 00:29:24.567877 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.567843 2576 generic.go:358] "Generic (PLEG): container finished" podID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" containerID="803e5f3d64c291e86b2323e6c85b0a1c976234c28a7dc5bbca133c73e8edb39e" exitCode=0 Apr 25 00:29:24.568040 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.567885 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" event={"ID":"46ae2373-92e9-4f0b-9ed5-6607ea44000b","Type":"ContainerDied","Data":"803e5f3d64c291e86b2323e6c85b0a1c976234c28a7dc5bbca133c73e8edb39e"} Apr 25 00:29:24.568040 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.567933 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" Apr 25 00:29:24.568040 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.567945 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc" event={"ID":"46ae2373-92e9-4f0b-9ed5-6607ea44000b","Type":"ContainerDied","Data":"68edda5c8beb456f4530d47d0668c97756f34444ce057a81b550a1f44b9cb3ac"} Apr 25 00:29:24.568040 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.567962 2576 scope.go:117] "RemoveContainer" containerID="2e918a2ac0788686121104c9bfe6a9e2596e42cc0338839832787c5d81d6af89" Apr 25 00:29:24.575906 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.575888 2576 scope.go:117] "RemoveContainer" containerID="803e5f3d64c291e86b2323e6c85b0a1c976234c28a7dc5bbca133c73e8edb39e" Apr 25 00:29:24.584720 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.584702 2576 scope.go:117] "RemoveContainer" containerID="32e8adb25be9d447374c5ca268bf12be5501cf69b6a4389b7a6c03a6d68a6dbf" Apr 25 00:29:24.591786 ip-10-0-129-109 kubenswrapper[2576]: I0425 
00:29:24.591764 2576 scope.go:117] "RemoveContainer" containerID="2e918a2ac0788686121104c9bfe6a9e2596e42cc0338839832787c5d81d6af89" Apr 25 00:29:24.592076 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:29:24.592052 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e918a2ac0788686121104c9bfe6a9e2596e42cc0338839832787c5d81d6af89\": container with ID starting with 2e918a2ac0788686121104c9bfe6a9e2596e42cc0338839832787c5d81d6af89 not found: ID does not exist" containerID="2e918a2ac0788686121104c9bfe6a9e2596e42cc0338839832787c5d81d6af89" Apr 25 00:29:24.592145 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.592092 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e918a2ac0788686121104c9bfe6a9e2596e42cc0338839832787c5d81d6af89"} err="failed to get container status \"2e918a2ac0788686121104c9bfe6a9e2596e42cc0338839832787c5d81d6af89\": rpc error: code = NotFound desc = could not find container \"2e918a2ac0788686121104c9bfe6a9e2596e42cc0338839832787c5d81d6af89\": container with ID starting with 2e918a2ac0788686121104c9bfe6a9e2596e42cc0338839832787c5d81d6af89 not found: ID does not exist" Apr 25 00:29:24.592145 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.592117 2576 scope.go:117] "RemoveContainer" containerID="803e5f3d64c291e86b2323e6c85b0a1c976234c28a7dc5bbca133c73e8edb39e" Apr 25 00:29:24.592416 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:29:24.592398 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"803e5f3d64c291e86b2323e6c85b0a1c976234c28a7dc5bbca133c73e8edb39e\": container with ID starting with 803e5f3d64c291e86b2323e6c85b0a1c976234c28a7dc5bbca133c73e8edb39e not found: ID does not exist" containerID="803e5f3d64c291e86b2323e6c85b0a1c976234c28a7dc5bbca133c73e8edb39e" Apr 25 00:29:24.592460 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.592423 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"803e5f3d64c291e86b2323e6c85b0a1c976234c28a7dc5bbca133c73e8edb39e"} err="failed to get container status \"803e5f3d64c291e86b2323e6c85b0a1c976234c28a7dc5bbca133c73e8edb39e\": rpc error: code = NotFound desc = could not find container \"803e5f3d64c291e86b2323e6c85b0a1c976234c28a7dc5bbca133c73e8edb39e\": container with ID starting with 803e5f3d64c291e86b2323e6c85b0a1c976234c28a7dc5bbca133c73e8edb39e not found: ID does not exist" Apr 25 00:29:24.592460 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.592440 2576 scope.go:117] "RemoveContainer" containerID="32e8adb25be9d447374c5ca268bf12be5501cf69b6a4389b7a6c03a6d68a6dbf" Apr 25 00:29:24.592685 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:29:24.592668 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32e8adb25be9d447374c5ca268bf12be5501cf69b6a4389b7a6c03a6d68a6dbf\": container with ID starting with 32e8adb25be9d447374c5ca268bf12be5501cf69b6a4389b7a6c03a6d68a6dbf not found: ID does not exist" containerID="32e8adb25be9d447374c5ca268bf12be5501cf69b6a4389b7a6c03a6d68a6dbf" Apr 25 00:29:24.592740 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.592691 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32e8adb25be9d447374c5ca268bf12be5501cf69b6a4389b7a6c03a6d68a6dbf"} err="failed to get container status \"32e8adb25be9d447374c5ca268bf12be5501cf69b6a4389b7a6c03a6d68a6dbf\": rpc error: code = NotFound desc = could not find container \"32e8adb25be9d447374c5ca268bf12be5501cf69b6a4389b7a6c03a6d68a6dbf\": container with ID starting with 32e8adb25be9d447374c5ca268bf12be5501cf69b6a4389b7a6c03a6d68a6dbf not found: ID does not exist" Apr 25 00:29:24.631538 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.631513 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-x4wmn\" (UniqueName: \"kubernetes.io/projected/46ae2373-92e9-4f0b-9ed5-6607ea44000b-kube-api-access-x4wmn\") pod \"46ae2373-92e9-4f0b-9ed5-6607ea44000b\" (UID: \"46ae2373-92e9-4f0b-9ed5-6607ea44000b\") " Apr 25 00:29:24.631623 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.631554 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/46ae2373-92e9-4f0b-9ed5-6607ea44000b-kserve-provision-location\") pod \"46ae2373-92e9-4f0b-9ed5-6607ea44000b\" (UID: \"46ae2373-92e9-4f0b-9ed5-6607ea44000b\") " Apr 25 00:29:24.631623 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.631602 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/46ae2373-92e9-4f0b-9ed5-6607ea44000b-proxy-tls\") pod \"46ae2373-92e9-4f0b-9ed5-6607ea44000b\" (UID: \"46ae2373-92e9-4f0b-9ed5-6607ea44000b\") " Apr 25 00:29:24.631696 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.631636 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/46ae2373-92e9-4f0b-9ed5-6607ea44000b-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"46ae2373-92e9-4f0b-9ed5-6607ea44000b\" (UID: \"46ae2373-92e9-4f0b-9ed5-6607ea44000b\") " Apr 25 00:29:24.631957 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.631932 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46ae2373-92e9-4f0b-9ed5-6607ea44000b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "46ae2373-92e9-4f0b-9ed5-6607ea44000b" (UID: "46ae2373-92e9-4f0b-9ed5-6607ea44000b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:29:24.632020 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.632001 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46ae2373-92e9-4f0b-9ed5-6607ea44000b-isvc-predictive-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-kube-rbac-proxy-sar-config") pod "46ae2373-92e9-4f0b-9ed5-6607ea44000b" (UID: "46ae2373-92e9-4f0b-9ed5-6607ea44000b"). InnerVolumeSpecName "isvc-predictive-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:29:24.633651 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.633623 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46ae2373-92e9-4f0b-9ed5-6607ea44000b-kube-api-access-x4wmn" (OuterVolumeSpecName: "kube-api-access-x4wmn") pod "46ae2373-92e9-4f0b-9ed5-6607ea44000b" (UID: "46ae2373-92e9-4f0b-9ed5-6607ea44000b"). InnerVolumeSpecName "kube-api-access-x4wmn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:29:24.633739 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.633696 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46ae2373-92e9-4f0b-9ed5-6607ea44000b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "46ae2373-92e9-4f0b-9ed5-6607ea44000b" (UID: "46ae2373-92e9-4f0b-9ed5-6607ea44000b"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:29:24.732524 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.732495 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x4wmn\" (UniqueName: \"kubernetes.io/projected/46ae2373-92e9-4f0b-9ed5-6607ea44000b-kube-api-access-x4wmn\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:29:24.732524 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.732520 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/46ae2373-92e9-4f0b-9ed5-6607ea44000b-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:29:24.732680 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.732532 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/46ae2373-92e9-4f0b-9ed5-6607ea44000b-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:29:24.732680 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.732542 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/46ae2373-92e9-4f0b-9ed5-6607ea44000b-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:29:24.894472 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.894439 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc"] Apr 25 00:29:24.904536 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:24.904511 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-sl8rc"] Apr 25 00:29:25.573111 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:25.573075 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" containerID="48ba0dc6c827ab6341437abd479ea01135a141fa9469df8b1d895cdeeafaf6b8" exitCode=0 Apr 25 00:29:25.573546 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:25.573138 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" event={"ID":"edddf8ee-a85a-4393-9230-fe4ac63aeda2","Type":"ContainerDied","Data":"48ba0dc6c827ab6341437abd479ea01135a141fa9469df8b1d895cdeeafaf6b8"} Apr 25 00:29:26.315099 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:26.315062 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" path="/var/lib/kubelet/pods/46ae2373-92e9-4f0b-9ed5-6607ea44000b/volumes" Apr 25 00:29:26.577279 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:26.577202 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" event={"ID":"edddf8ee-a85a-4393-9230-fe4ac63aeda2","Type":"ContainerStarted","Data":"35a3bdb3367acadfcd26f4f3e72a8da164866dc71bea9c46e0812f5fed8f5aff"} Apr 25 00:29:26.577279 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:26.577242 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" event={"ID":"edddf8ee-a85a-4393-9230-fe4ac63aeda2","Type":"ContainerStarted","Data":"e06377f0f7833e1c328ef9e27f92726a0b4fff15d1144651961d8ed336e4e392"} Apr 25 00:29:26.577653 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:26.577556 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" Apr 25 00:29:26.595597 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:26.595546 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" podStartSLOduration=6.595530489 
podStartE2EDuration="6.595530489s" podCreationTimestamp="2026-04-25 00:29:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:29:26.593221625 +0000 UTC m=+2126.874338544" watchObservedRunningTime="2026-04-25 00:29:26.595530489 +0000 UTC m=+2126.876647411" Apr 25 00:29:27.580000 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:27.579973 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" Apr 25 00:29:27.581399 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:27.581367 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" podUID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 25 00:29:28.584367 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:28.584328 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" podUID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 25 00:29:33.588765 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:33.588730 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" Apr 25 00:29:33.589323 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:33.589293 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" podUID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 25 00:29:37.092195 
ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:37.092167 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 25 00:29:37.094472 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:37.094449 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 25 00:29:43.589668 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:43.589621 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" podUID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 25 00:29:53.589395 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:29:53.589350 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" podUID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 25 00:30:03.590258 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:03.590215 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" podUID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 25 00:30:13.590150 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:13.590113 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" podUID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" 
Apr 25 00:30:23.589307 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:23.589258 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" podUID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 25 00:30:33.589413 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:33.589366 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" podUID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 25 00:30:43.589975 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:43.589945 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" Apr 25 00:30:50.327581 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.327545 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4"] Apr 25 00:30:50.328014 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.327855 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" podUID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" containerName="kserve-container" containerID="cri-o://e06377f0f7833e1c328ef9e27f92726a0b4fff15d1144651961d8ed336e4e392" gracePeriod=30 Apr 25 00:30:50.328014 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.327906 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" podUID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" containerName="kube-rbac-proxy" 
containerID="cri-o://35a3bdb3367acadfcd26f4f3e72a8da164866dc71bea9c46e0812f5fed8f5aff" gracePeriod=30 Apr 25 00:30:50.429235 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.429209 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl"] Apr 25 00:30:50.429485 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.429473 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" containerName="kserve-container" Apr 25 00:30:50.429542 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.429487 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" containerName="kserve-container" Apr 25 00:30:50.429542 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.429500 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" containerName="kube-rbac-proxy" Apr 25 00:30:50.429622 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.429546 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" containerName="kube-rbac-proxy" Apr 25 00:30:50.429622 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.429555 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" containerName="storage-initializer" Apr 25 00:30:50.429622 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.429560 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" containerName="storage-initializer" Apr 25 00:30:50.429622 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.429604 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" containerName="kube-rbac-proxy" Apr 25 00:30:50.429622 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.429615 2576 
memory_manager.go:356] "RemoveStaleState removing state" podUID="46ae2373-92e9-4f0b-9ed5-6607ea44000b" containerName="kserve-container" Apr 25 00:30:50.432795 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.432776 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" Apr 25 00:30:50.435098 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.435077 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-predictor-serving-cert\"" Apr 25 00:30:50.435205 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.435082 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\"" Apr 25 00:30:50.444933 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.442490 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl"] Apr 25 00:30:50.571736 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.571693 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl\" (UID: \"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" Apr 25 00:30:50.571736 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.571741 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl\" (UID: 
\"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" Apr 25 00:30:50.572002 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.571779 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl\" (UID: \"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" Apr 25 00:30:50.572002 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.571804 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghmz9\" (UniqueName: \"kubernetes.io/projected/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-kube-api-access-ghmz9\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl\" (UID: \"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" Apr 25 00:30:50.672358 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.672266 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl\" (UID: \"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" Apr 25 00:30:50.672358 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.672304 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl\" (UID: 
\"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" Apr 25 00:30:50.672358 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.672340 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl\" (UID: \"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" Apr 25 00:30:50.672650 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.672366 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghmz9\" (UniqueName: \"kubernetes.io/projected/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-kube-api-access-ghmz9\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl\" (UID: \"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" Apr 25 00:30:50.672650 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:30:50.672460 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-serving-cert: secret "isvc-predictive-lightgbm-predictor-serving-cert" not found Apr 25 00:30:50.672650 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:30:50.672550 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-proxy-tls podName:1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56 nodeName:}" failed. No retries permitted until 2026-04-25 00:30:51.172529101 +0000 UTC m=+2211.453646000 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-proxy-tls") pod "isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" (UID: "1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56") : secret "isvc-predictive-lightgbm-predictor-serving-cert" not found Apr 25 00:30:50.672807 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.672716 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl\" (UID: \"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" Apr 25 00:30:50.672979 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.672961 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl\" (UID: \"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" Apr 25 00:30:50.683075 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.683054 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghmz9\" (UniqueName: \"kubernetes.io/projected/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-kube-api-access-ghmz9\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl\" (UID: \"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" Apr 25 00:30:50.815005 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.814970 2576 generic.go:358] "Generic (PLEG): container finished" podID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" 
containerID="35a3bdb3367acadfcd26f4f3e72a8da164866dc71bea9c46e0812f5fed8f5aff" exitCode=2 Apr 25 00:30:50.815171 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:50.815046 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" event={"ID":"edddf8ee-a85a-4393-9230-fe4ac63aeda2","Type":"ContainerDied","Data":"35a3bdb3367acadfcd26f4f3e72a8da164866dc71bea9c46e0812f5fed8f5aff"} Apr 25 00:30:51.176443 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:51.176399 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl\" (UID: \"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" Apr 25 00:30:51.179031 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:51.179010 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl\" (UID: \"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" Apr 25 00:30:51.348534 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:51.348505 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" Apr 25 00:30:51.471370 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:51.471344 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl"] Apr 25 00:30:51.471815 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:30:51.471789 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f88b6ea_6bbd_41a8_9d09_80d1eea0eb56.slice/crio-bf94e196b4e671da1621354ba847c947460d1ae7c5527f348ba67c5134a60998 WatchSource:0}: Error finding container bf94e196b4e671da1621354ba847c947460d1ae7c5527f348ba67c5134a60998: Status 404 returned error can't find the container with id bf94e196b4e671da1621354ba847c947460d1ae7c5527f348ba67c5134a60998 Apr 25 00:30:51.820021 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:51.819981 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" event={"ID":"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56","Type":"ContainerStarted","Data":"bc911f5aa30dce6d10eb5e7ea8077c599dddcfea4c14235facac4443e9b85a54"} Apr 25 00:30:51.820021 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:51.820018 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" event={"ID":"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56","Type":"ContainerStarted","Data":"bf94e196b4e671da1621354ba847c947460d1ae7c5527f348ba67c5134a60998"} Apr 25 00:30:53.585534 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:53.585489 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" podUID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.42:8643/healthz\": dial tcp 10.134.0.42:8643: 
connect: connection refused" Apr 25 00:30:53.589829 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:53.589806 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" podUID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 25 00:30:54.670500 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.670479 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" Apr 25 00:30:54.700616 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.700582 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/edddf8ee-a85a-4393-9230-fe4ac63aeda2-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"edddf8ee-a85a-4393-9230-fe4ac63aeda2\" (UID: \"edddf8ee-a85a-4393-9230-fe4ac63aeda2\") " Apr 25 00:30:54.700616 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.700632 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/edddf8ee-a85a-4393-9230-fe4ac63aeda2-kserve-provision-location\") pod \"edddf8ee-a85a-4393-9230-fe4ac63aeda2\" (UID: \"edddf8ee-a85a-4393-9230-fe4ac63aeda2\") " Apr 25 00:30:54.700860 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.700731 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edddf8ee-a85a-4393-9230-fe4ac63aeda2-proxy-tls\") pod \"edddf8ee-a85a-4393-9230-fe4ac63aeda2\" (UID: \"edddf8ee-a85a-4393-9230-fe4ac63aeda2\") " Apr 25 00:30:54.700860 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.700761 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-xn2l7\" (UniqueName: \"kubernetes.io/projected/edddf8ee-a85a-4393-9230-fe4ac63aeda2-kube-api-access-xn2l7\") pod \"edddf8ee-a85a-4393-9230-fe4ac63aeda2\" (UID: \"edddf8ee-a85a-4393-9230-fe4ac63aeda2\") "
Apr 25 00:30:54.700999 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.700969 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edddf8ee-a85a-4393-9230-fe4ac63aeda2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "edddf8ee-a85a-4393-9230-fe4ac63aeda2" (UID: "edddf8ee-a85a-4393-9230-fe4ac63aeda2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:30:54.700999 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.700983 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edddf8ee-a85a-4393-9230-fe4ac63aeda2-isvc-predictive-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-kube-rbac-proxy-sar-config") pod "edddf8ee-a85a-4393-9230-fe4ac63aeda2" (UID: "edddf8ee-a85a-4393-9230-fe4ac63aeda2"). InnerVolumeSpecName "isvc-predictive-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:30:54.703003 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.702978 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edddf8ee-a85a-4393-9230-fe4ac63aeda2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "edddf8ee-a85a-4393-9230-fe4ac63aeda2" (UID: "edddf8ee-a85a-4393-9230-fe4ac63aeda2"). InnerVolumeSpecName "proxy-tls".
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:30:54.703382 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.703364 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edddf8ee-a85a-4393-9230-fe4ac63aeda2-kube-api-access-xn2l7" (OuterVolumeSpecName: "kube-api-access-xn2l7") pod "edddf8ee-a85a-4393-9230-fe4ac63aeda2" (UID: "edddf8ee-a85a-4393-9230-fe4ac63aeda2"). InnerVolumeSpecName "kube-api-access-xn2l7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:30:54.802064 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.802019 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/edddf8ee-a85a-4393-9230-fe4ac63aeda2-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:30:54.802064 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.802066 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xn2l7\" (UniqueName: \"kubernetes.io/projected/edddf8ee-a85a-4393-9230-fe4ac63aeda2-kube-api-access-xn2l7\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:30:54.802064 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.802077 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/edddf8ee-a85a-4393-9230-fe4ac63aeda2-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:30:54.802273 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.802086 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/edddf8ee-a85a-4393-9230-fe4ac63aeda2-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:30:54.830634 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.830603 2576 generic.go:358] "Generic (PLEG):
container finished" podID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" containerID="e06377f0f7833e1c328ef9e27f92726a0b4fff15d1144651961d8ed336e4e392" exitCode=0
Apr 25 00:30:54.830758 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.830692 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" event={"ID":"edddf8ee-a85a-4393-9230-fe4ac63aeda2","Type":"ContainerDied","Data":"e06377f0f7833e1c328ef9e27f92726a0b4fff15d1144651961d8ed336e4e392"}
Apr 25 00:30:54.830758 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.830723 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4"
Apr 25 00:30:54.830758 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.830732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4" event={"ID":"edddf8ee-a85a-4393-9230-fe4ac63aeda2","Type":"ContainerDied","Data":"6e2d2b9917a52db34aae11f52091813e624946880a7f967f9db0cec346483146"}
Apr 25 00:30:54.830758 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.830748 2576 scope.go:117] "RemoveContainer" containerID="35a3bdb3367acadfcd26f4f3e72a8da164866dc71bea9c46e0812f5fed8f5aff"
Apr 25 00:30:54.840366 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.840348 2576 scope.go:117] "RemoveContainer" containerID="e06377f0f7833e1c328ef9e27f92726a0b4fff15d1144651961d8ed336e4e392"
Apr 25 00:30:54.847511 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.847494 2576 scope.go:117] "RemoveContainer" containerID="48ba0dc6c827ab6341437abd479ea01135a141fa9469df8b1d895cdeeafaf6b8"
Apr 25 00:30:54.856095 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.856066 2576 scope.go:117] "RemoveContainer" containerID="35a3bdb3367acadfcd26f4f3e72a8da164866dc71bea9c46e0812f5fed8f5aff"
Apr 25 00:30:54.856517 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:30:54.856376 2576
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35a3bdb3367acadfcd26f4f3e72a8da164866dc71bea9c46e0812f5fed8f5aff\": container with ID starting with 35a3bdb3367acadfcd26f4f3e72a8da164866dc71bea9c46e0812f5fed8f5aff not found: ID does not exist" containerID="35a3bdb3367acadfcd26f4f3e72a8da164866dc71bea9c46e0812f5fed8f5aff"
Apr 25 00:30:54.856517 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.856427 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35a3bdb3367acadfcd26f4f3e72a8da164866dc71bea9c46e0812f5fed8f5aff"} err="failed to get container status \"35a3bdb3367acadfcd26f4f3e72a8da164866dc71bea9c46e0812f5fed8f5aff\": rpc error: code = NotFound desc = could not find container \"35a3bdb3367acadfcd26f4f3e72a8da164866dc71bea9c46e0812f5fed8f5aff\": container with ID starting with 35a3bdb3367acadfcd26f4f3e72a8da164866dc71bea9c46e0812f5fed8f5aff not found: ID does not exist"
Apr 25 00:30:54.856517 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.856453 2576 scope.go:117] "RemoveContainer" containerID="e06377f0f7833e1c328ef9e27f92726a0b4fff15d1144651961d8ed336e4e392"
Apr 25 00:30:54.857232 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:30:54.857204 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e06377f0f7833e1c328ef9e27f92726a0b4fff15d1144651961d8ed336e4e392\": container with ID starting with e06377f0f7833e1c328ef9e27f92726a0b4fff15d1144651961d8ed336e4e392 not found: ID does not exist" containerID="e06377f0f7833e1c328ef9e27f92726a0b4fff15d1144651961d8ed336e4e392"
Apr 25 00:30:54.857329 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.857242 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e06377f0f7833e1c328ef9e27f92726a0b4fff15d1144651961d8ed336e4e392"} err="failed to get container status
\"e06377f0f7833e1c328ef9e27f92726a0b4fff15d1144651961d8ed336e4e392\": rpc error: code = NotFound desc = could not find container \"e06377f0f7833e1c328ef9e27f92726a0b4fff15d1144651961d8ed336e4e392\": container with ID starting with e06377f0f7833e1c328ef9e27f92726a0b4fff15d1144651961d8ed336e4e392 not found: ID does not exist"
Apr 25 00:30:54.857329 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.857275 2576 scope.go:117] "RemoveContainer" containerID="48ba0dc6c827ab6341437abd479ea01135a141fa9469df8b1d895cdeeafaf6b8"
Apr 25 00:30:54.857645 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:30:54.857622 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48ba0dc6c827ab6341437abd479ea01135a141fa9469df8b1d895cdeeafaf6b8\": container with ID starting with 48ba0dc6c827ab6341437abd479ea01135a141fa9469df8b1d895cdeeafaf6b8 not found: ID does not exist" containerID="48ba0dc6c827ab6341437abd479ea01135a141fa9469df8b1d895cdeeafaf6b8"
Apr 25 00:30:54.857739 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.857654 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48ba0dc6c827ab6341437abd479ea01135a141fa9469df8b1d895cdeeafaf6b8"} err="failed to get container status \"48ba0dc6c827ab6341437abd479ea01135a141fa9469df8b1d895cdeeafaf6b8\": rpc error: code = NotFound desc = could not find container \"48ba0dc6c827ab6341437abd479ea01135a141fa9469df8b1d895cdeeafaf6b8\": container with ID starting with 48ba0dc6c827ab6341437abd479ea01135a141fa9469df8b1d895cdeeafaf6b8 not found: ID does not exist"
Apr 25 00:30:54.862686 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.862654 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4"]
Apr 25 00:30:54.862857 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:54.862834 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api"
pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-ftnm4"]
Apr 25 00:30:54.869319 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:30:54.869296 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedddf8ee_a85a_4393_9230_fe4ac63aeda2.slice/crio-6e2d2b9917a52db34aae11f52091813e624946880a7f967f9db0cec346483146\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedddf8ee_a85a_4393_9230_fe4ac63aeda2.slice\": RecentStats: unable to find data in memory cache]"
Apr 25 00:30:55.835087 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:55.835052 2576 generic.go:358] "Generic (PLEG): container finished" podID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" containerID="bc911f5aa30dce6d10eb5e7ea8077c599dddcfea4c14235facac4443e9b85a54" exitCode=0
Apr 25 00:30:55.835446 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:55.835093 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" event={"ID":"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56","Type":"ContainerDied","Data":"bc911f5aa30dce6d10eb5e7ea8077c599dddcfea4c14235facac4443e9b85a54"}
Apr 25 00:30:56.314879 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:56.314845 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" path="/var/lib/kubelet/pods/edddf8ee-a85a-4393-9230-fe4ac63aeda2/volumes"
Apr 25 00:30:56.840109 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:56.840081 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" event={"ID":"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56","Type":"ContainerStarted","Data":"566b208e4ad648cc166d80d2210d74fc8b5339fbc6453840ae914423f45c3c8c"}
Apr 25 00:30:56.840109 ip-10-0-129-109 kubenswrapper[2576]:
I0425 00:30:56.840114 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" event={"ID":"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56","Type":"ContainerStarted","Data":"dee3965e559de30662a49f6412d92bf60dcb0a00bbd93ae11a8ef558012817a2"}
Apr 25 00:30:56.840528 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:56.840315 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl"
Apr 25 00:30:57.842979 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:57.842945 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl"
Apr 25 00:30:57.844180 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:57.844152 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" podUID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 25 00:30:58.845705 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:30:58.845661 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" podUID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 25 00:31:03.849607 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:31:03.849579 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl"
Apr 25 00:31:03.850240 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:31:03.850210 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl"
podUID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 25 00:31:03.868298 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:31:03.868253 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" podStartSLOduration=13.8682419 podStartE2EDuration="13.8682419s" podCreationTimestamp="2026-04-25 00:30:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:30:56.869149518 +0000 UTC m=+2217.150266437" watchObservedRunningTime="2026-04-25 00:31:03.8682419 +0000 UTC m=+2224.149358820"
Apr 25 00:31:13.850975 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:31:13.850887 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" podUID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 25 00:31:23.851017 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:31:23.850976 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" podUID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 25 00:31:33.851036 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:31:33.850992 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" podUID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 25 00:31:43.850340 ip-10-0-129-109 kubenswrapper[2576]: I0425
00:31:43.850302 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" podUID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 25 00:31:53.850544 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:31:53.850502 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" podUID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 25 00:32:03.851028 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:03.850979 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" podUID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 25 00:32:13.851497 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:13.851468 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl"
Apr 25 00:32:20.538023 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.537988 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl"]
Apr 25 00:32:20.538509 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.538320 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" podUID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" containerName="kserve-container" containerID="cri-o://dee3965e559de30662a49f6412d92bf60dcb0a00bbd93ae11a8ef558012817a2" gracePeriod=30
Apr 25 00:32:20.538509 ip-10-0-129-109 kubenswrapper[2576]: I0425
00:32:20.538384 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" podUID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" containerName="kube-rbac-proxy" containerID="cri-o://566b208e4ad648cc166d80d2210d74fc8b5339fbc6453840ae914423f45c3c8c" gracePeriod=30
Apr 25 00:32:20.642438 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.642400 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb"]
Apr 25 00:32:20.642815 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.642796 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" containerName="kserve-container"
Apr 25 00:32:20.642875 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.642822 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" containerName="kserve-container"
Apr 25 00:32:20.642875 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.642844 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" containerName="storage-initializer"
Apr 25 00:32:20.642875 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.642853 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" containerName="storage-initializer"
Apr 25 00:32:20.642875 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.642872 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" containerName="kube-rbac-proxy"
Apr 25 00:32:20.643047 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.642881 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" containerName="kube-rbac-proxy"
Apr 25 00:32:20.643047 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.642978
2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" containerName="kserve-container"
Apr 25 00:32:20.643047 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.642992 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="edddf8ee-a85a-4393-9230-fe4ac63aeda2" containerName="kube-rbac-proxy"
Apr 25 00:32:20.646137 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.646116 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb"
Apr 25 00:32:20.648726 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.648700 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\""
Apr 25 00:32:20.648832 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.648708 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-predictor-serving-cert\""
Apr 25 00:32:20.654083 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.654049 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb"]
Apr 25 00:32:20.720832 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.720799 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8e9a252-6a4d-4c75-8baf-2f53659367e5-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb\" (UID: \"d8e9a252-6a4d-4c75-8baf-2f53659367e5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb"
Apr 25 00:32:20.721000 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.720848 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d8e9a252-6a4d-4c75-8baf-2f53659367e5-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb\" (UID: \"d8e9a252-6a4d-4c75-8baf-2f53659367e5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb"
Apr 25 00:32:20.721000 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.720945 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8e9a252-6a4d-4c75-8baf-2f53659367e5-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb\" (UID: \"d8e9a252-6a4d-4c75-8baf-2f53659367e5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb"
Apr 25 00:32:20.721000 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.720963 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnj6f\" (UniqueName: \"kubernetes.io/projected/d8e9a252-6a4d-4c75-8baf-2f53659367e5-kube-api-access-bnj6f\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb\" (UID: \"d8e9a252-6a4d-4c75-8baf-2f53659367e5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb"
Apr 25 00:32:20.821417 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.821322 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8e9a252-6a4d-4c75-8baf-2f53659367e5-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb\" (UID: \"d8e9a252-6a4d-4c75-8baf-2f53659367e5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb"
Apr 25 00:32:20.821417 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.821361 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bnj6f\"
(UniqueName: \"kubernetes.io/projected/d8e9a252-6a4d-4c75-8baf-2f53659367e5-kube-api-access-bnj6f\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb\" (UID: \"d8e9a252-6a4d-4c75-8baf-2f53659367e5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb"
Apr 25 00:32:20.821417 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.821378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8e9a252-6a4d-4c75-8baf-2f53659367e5-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb\" (UID: \"d8e9a252-6a4d-4c75-8baf-2f53659367e5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb"
Apr 25 00:32:20.821417 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.821404 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d8e9a252-6a4d-4c75-8baf-2f53659367e5-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb\" (UID: \"d8e9a252-6a4d-4c75-8baf-2f53659367e5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb"
Apr 25 00:32:20.821894 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.821872 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8e9a252-6a4d-4c75-8baf-2f53659367e5-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb\" (UID: \"d8e9a252-6a4d-4c75-8baf-2f53659367e5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb"
Apr 25 00:32:20.822147 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.822123 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d8e9a252-6a4d-4c75-8baf-2f53659367e5-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb\" (UID: \"d8e9a252-6a4d-4c75-8baf-2f53659367e5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb"
Apr 25 00:32:20.824425 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.824402 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8e9a252-6a4d-4c75-8baf-2f53659367e5-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb\" (UID: \"d8e9a252-6a4d-4c75-8baf-2f53659367e5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb"
Apr 25 00:32:20.829521 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.829496 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnj6f\" (UniqueName: \"kubernetes.io/projected/d8e9a252-6a4d-4c75-8baf-2f53659367e5-kube-api-access-bnj6f\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb\" (UID: \"d8e9a252-6a4d-4c75-8baf-2f53659367e5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb"
Apr 25 00:32:20.958319 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:20.958287 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb"
Apr 25 00:32:21.068698 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:21.068664 2576 generic.go:358] "Generic (PLEG): container finished" podID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" containerID="566b208e4ad648cc166d80d2210d74fc8b5339fbc6453840ae914423f45c3c8c" exitCode=2
Apr 25 00:32:21.068857 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:21.068739 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" event={"ID":"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56","Type":"ContainerDied","Data":"566b208e4ad648cc166d80d2210d74fc8b5339fbc6453840ae914423f45c3c8c"}
Apr 25 00:32:21.082549 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:21.082526 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb"]
Apr 25 00:32:21.084467 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:32:21.084442 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8e9a252_6a4d_4c75_8baf_2f53659367e5.slice/crio-ea2708bd072a61d80c9295d504a76a02426a3c65f4f3a78ed147467242e316d1 WatchSource:0}: Error finding container ea2708bd072a61d80c9295d504a76a02426a3c65f4f3a78ed147467242e316d1: Status 404 returned error can't find the container with id ea2708bd072a61d80c9295d504a76a02426a3c65f4f3a78ed147467242e316d1
Apr 25 00:32:21.086107 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:21.086092 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 25 00:32:22.073460 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:22.073425 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb"
event={"ID":"d8e9a252-6a4d-4c75-8baf-2f53659367e5","Type":"ContainerStarted","Data":"7e3c08828aedb88e59361bad9780c2fc74f35dda49e95f9de8e0955d6384dde9"}
Apr 25 00:32:22.073460 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:22.073458 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb" event={"ID":"d8e9a252-6a4d-4c75-8baf-2f53659367e5","Type":"ContainerStarted","Data":"ea2708bd072a61d80c9295d504a76a02426a3c65f4f3a78ed147467242e316d1"}
Apr 25 00:32:23.846092 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:23.846042 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" podUID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.43:8643/healthz\": dial tcp 10.134.0.43:8643: connect: connection refused"
Apr 25 00:32:23.850339 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:23.850313 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" podUID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused"
Apr 25 00:32:25.083273 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:25.083176 2576 generic.go:358] "Generic (PLEG): container finished" podID="d8e9a252-6a4d-4c75-8baf-2f53659367e5" containerID="7e3c08828aedb88e59361bad9780c2fc74f35dda49e95f9de8e0955d6384dde9" exitCode=0
Apr 25 00:32:25.083273 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:25.083250 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb" event={"ID":"d8e9a252-6a4d-4c75-8baf-2f53659367e5","Type":"ContainerDied","Data":"7e3c08828aedb88e59361bad9780c2fc74f35dda49e95f9de8e0955d6384dde9"}
Apr 25 00:32:26.088982 ip-10-0-129-109
kubenswrapper[2576]: I0425 00:32:26.088948 2576 generic.go:358] "Generic (PLEG): container finished" podID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" containerID="dee3965e559de30662a49f6412d92bf60dcb0a00bbd93ae11a8ef558012817a2" exitCode=0
Apr 25 00:32:26.089367 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:26.089018 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" event={"ID":"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56","Type":"ContainerDied","Data":"dee3965e559de30662a49f6412d92bf60dcb0a00bbd93ae11a8ef558012817a2"}
Apr 25 00:32:26.090932 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:26.090891 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb" event={"ID":"d8e9a252-6a4d-4c75-8baf-2f53659367e5","Type":"ContainerStarted","Data":"a1f28c8a508e1408c76d250277c5ef58c93146b359df130bd085e70eac983c36"}
Apr 25 00:32:26.091042 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:26.090936 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb" event={"ID":"d8e9a252-6a4d-4c75-8baf-2f53659367e5","Type":"ContainerStarted","Data":"dc765ae9965853b12c013db9ffbe7a70e6c1a70224ed96e012be60527ffabb61"}
Apr 25 00:32:26.091187 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:26.091171 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb"
Apr 25 00:32:26.091249 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:26.091231 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb"
Apr 25 00:32:26.110167 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:26.110115 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb" podStartSLOduration=6.110097684 podStartE2EDuration="6.110097684s" podCreationTimestamp="2026-04-25 00:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:32:26.107907867 +0000 UTC m=+2306.389024777" watchObservedRunningTime="2026-04-25 00:32:26.110097684 +0000 UTC m=+2306.391214605" Apr 25 00:32:26.181053 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:26.181030 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" Apr 25 00:32:26.261511 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:26.261484 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghmz9\" (UniqueName: \"kubernetes.io/projected/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-kube-api-access-ghmz9\") pod \"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56\" (UID: \"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56\") " Apr 25 00:32:26.261653 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:26.261521 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-proxy-tls\") pod \"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56\" (UID: \"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56\") " Apr 25 00:32:26.261653 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:26.261543 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-kserve-provision-location\") pod \"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56\" (UID: \"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56\") " Apr 25 00:32:26.261653 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:26.261574 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56\" (UID: \"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56\") " Apr 25 00:32:26.261956 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:26.261899 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" (UID: "1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:32:26.262092 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:26.261982 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config") pod "1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" (UID: "1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56"). InnerVolumeSpecName "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:32:26.263768 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:26.263747 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-kube-api-access-ghmz9" (OuterVolumeSpecName: "kube-api-access-ghmz9") pod "1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" (UID: "1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56"). InnerVolumeSpecName "kube-api-access-ghmz9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:32:26.263841 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:26.263747 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" (UID: "1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:32:26.362572 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:26.362543 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ghmz9\" (UniqueName: \"kubernetes.io/projected/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-kube-api-access-ghmz9\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:32:26.362572 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:26.362567 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:32:26.362696 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:26.362576 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:32:26.362696 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:26.362586 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:32:27.095734 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:27.095702 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" Apr 25 00:32:27.096239 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:27.095702 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl" event={"ID":"1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56","Type":"ContainerDied","Data":"bf94e196b4e671da1621354ba847c947460d1ae7c5527f348ba67c5134a60998"} Apr 25 00:32:27.096239 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:27.095814 2576 scope.go:117] "RemoveContainer" containerID="566b208e4ad648cc166d80d2210d74fc8b5339fbc6453840ae914423f45c3c8c" Apr 25 00:32:27.103613 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:27.103592 2576 scope.go:117] "RemoveContainer" containerID="dee3965e559de30662a49f6412d92bf60dcb0a00bbd93ae11a8ef558012817a2" Apr 25 00:32:27.110399 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:27.110381 2576 scope.go:117] "RemoveContainer" containerID="bc911f5aa30dce6d10eb5e7ea8077c599dddcfea4c14235facac4443e9b85a54" Apr 25 00:32:27.113066 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:27.113041 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl"] Apr 25 00:32:27.118001 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:27.117975 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-khqdl"] Apr 25 00:32:28.315179 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:28.315147 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" path="/var/lib/kubelet/pods/1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56/volumes" Apr 25 00:32:32.105036 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:32:32.105009 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb" Apr 25 
00:33:02.106386 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:02.106347 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb" podUID="d8e9a252-6a4d-4c75-8baf-2f53659367e5" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.44:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.44:8080: connect: connection refused" Apr 25 00:33:12.105996 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:12.105956 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb" podUID="d8e9a252-6a4d-4c75-8baf-2f53659367e5" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.44:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.44:8080: connect: connection refused" Apr 25 00:33:22.105710 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:22.105667 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb" podUID="d8e9a252-6a4d-4c75-8baf-2f53659367e5" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.44:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.44:8080: connect: connection refused" Apr 25 00:33:32.105755 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:32.105717 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb" podUID="d8e9a252-6a4d-4c75-8baf-2f53659367e5" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.44:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.44:8080: connect: connection refused" Apr 25 00:33:42.109094 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:42.109059 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb" Apr 25 00:33:50.735928 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:50.735876 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb"] Apr 25 00:33:50.736375 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:50.736288 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb" podUID="d8e9a252-6a4d-4c75-8baf-2f53659367e5" containerName="kserve-container" containerID="cri-o://dc765ae9965853b12c013db9ffbe7a70e6c1a70224ed96e012be60527ffabb61" gracePeriod=30 Apr 25 00:33:50.736448 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:50.736368 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb" podUID="d8e9a252-6a4d-4c75-8baf-2f53659367e5" containerName="kube-rbac-proxy" containerID="cri-o://a1f28c8a508e1408c76d250277c5ef58c93146b359df130bd085e70eac983c36" gracePeriod=30 Apr 25 00:33:50.852780 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:50.852751 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s"] Apr 25 00:33:50.853107 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:50.853092 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" containerName="kserve-container" Apr 25 00:33:50.853159 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:50.853110 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" containerName="kserve-container" Apr 25 00:33:50.853159 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:50.853125 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" 
containerName="kube-rbac-proxy" Apr 25 00:33:50.853159 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:50.853131 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" containerName="kube-rbac-proxy" Apr 25 00:33:50.853159 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:50.853138 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" containerName="storage-initializer" Apr 25 00:33:50.853159 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:50.853143 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" containerName="storage-initializer" Apr 25 00:33:50.853317 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:50.853192 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" containerName="kserve-container" Apr 25 00:33:50.853317 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:50.853198 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f88b6ea-6bbd-41a8-9d09-80d1eea0eb56" containerName="kube-rbac-proxy" Apr 25 00:33:50.856157 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:50.856139 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" Apr 25 00:33:50.858572 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:50.858547 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 25 00:33:50.858664 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:50.858574 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-predictor-serving-cert\"" Apr 25 00:33:50.867153 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:50.867132 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s"] Apr 25 00:33:50.962518 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:50.962485 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s\" (UID: \"c46480ac-83fd-413d-9aa8-38c8b6ee2c78\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" Apr 25 00:33:50.962668 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:50.962605 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s\" (UID: \"c46480ac-83fd-413d-9aa8-38c8b6ee2c78\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" Apr 25 00:33:50.962668 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:50.962630 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s\" (UID: \"c46480ac-83fd-413d-9aa8-38c8b6ee2c78\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" Apr 25 00:33:50.962668 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:50.962662 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv8fz\" (UniqueName: \"kubernetes.io/projected/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-kube-api-access-cv8fz\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s\" (UID: \"c46480ac-83fd-413d-9aa8-38c8b6ee2c78\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" Apr 25 00:33:51.063884 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:51.063856 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s\" (UID: \"c46480ac-83fd-413d-9aa8-38c8b6ee2c78\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" Apr 25 00:33:51.064084 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:51.063890 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s\" (UID: \"c46480ac-83fd-413d-9aa8-38c8b6ee2c78\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" Apr 25 00:33:51.064084 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:51.063932 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-cv8fz\" (UniqueName: \"kubernetes.io/projected/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-kube-api-access-cv8fz\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s\" (UID: \"c46480ac-83fd-413d-9aa8-38c8b6ee2c78\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" Apr 25 00:33:51.064084 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:51.063974 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s\" (UID: \"c46480ac-83fd-413d-9aa8-38c8b6ee2c78\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" Apr 25 00:33:51.064389 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:51.064362 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s\" (UID: \"c46480ac-83fd-413d-9aa8-38c8b6ee2c78\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" Apr 25 00:33:51.064594 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:51.064575 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s\" (UID: \"c46480ac-83fd-413d-9aa8-38c8b6ee2c78\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" Apr 25 00:33:51.066464 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:51.066446 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s\" (UID: \"c46480ac-83fd-413d-9aa8-38c8b6ee2c78\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" Apr 25 00:33:51.072241 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:51.072222 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv8fz\" (UniqueName: \"kubernetes.io/projected/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-kube-api-access-cv8fz\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s\" (UID: \"c46480ac-83fd-413d-9aa8-38c8b6ee2c78\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" Apr 25 00:33:51.166631 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:51.166600 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" Apr 25 00:33:51.287021 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:51.286995 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s"] Apr 25 00:33:51.289183 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:33:51.289149 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc46480ac_83fd_413d_9aa8_38c8b6ee2c78.slice/crio-c53215fb235a4bef3efdda20f3df784bfa5fdbf8e6f5de463919ec9a181f9d14 WatchSource:0}: Error finding container c53215fb235a4bef3efdda20f3df784bfa5fdbf8e6f5de463919ec9a181f9d14: Status 404 returned error can't find the container with id c53215fb235a4bef3efdda20f3df784bfa5fdbf8e6f5de463919ec9a181f9d14 Apr 25 00:33:51.336497 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:51.336462 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" event={"ID":"c46480ac-83fd-413d-9aa8-38c8b6ee2c78","Type":"ContainerStarted","Data":"c53215fb235a4bef3efdda20f3df784bfa5fdbf8e6f5de463919ec9a181f9d14"} Apr 25 00:33:51.338120 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:51.338097 2576 generic.go:358] "Generic (PLEG): container finished" podID="d8e9a252-6a4d-4c75-8baf-2f53659367e5" containerID="a1f28c8a508e1408c76d250277c5ef58c93146b359df130bd085e70eac983c36" exitCode=2 Apr 25 00:33:51.338206 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:51.338129 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb" event={"ID":"d8e9a252-6a4d-4c75-8baf-2f53659367e5","Type":"ContainerDied","Data":"a1f28c8a508e1408c76d250277c5ef58c93146b359df130bd085e70eac983c36"} Apr 25 00:33:52.096671 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:52.096620 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb" podUID="d8e9a252-6a4d-4c75-8baf-2f53659367e5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.44:8643/healthz\": dial tcp 10.134.0.44:8643: connect: connection refused" Apr 25 00:33:52.106147 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:52.106117 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb" podUID="d8e9a252-6a4d-4c75-8baf-2f53659367e5" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.44:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.44:8080: connect: connection refused" Apr 25 00:33:52.342078 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:52.342034 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" 
event={"ID":"c46480ac-83fd-413d-9aa8-38c8b6ee2c78","Type":"ContainerStarted","Data":"0b84081723ca1453c8aa4d244a5f5ec331449b973e67dc60baf7822bc3f8113e"} Apr 25 00:33:55.276762 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.276742 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb" Apr 25 00:33:55.351779 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.351709 2576 generic.go:358] "Generic (PLEG): container finished" podID="c46480ac-83fd-413d-9aa8-38c8b6ee2c78" containerID="0b84081723ca1453c8aa4d244a5f5ec331449b973e67dc60baf7822bc3f8113e" exitCode=0 Apr 25 00:33:55.351930 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.351783 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" event={"ID":"c46480ac-83fd-413d-9aa8-38c8b6ee2c78","Type":"ContainerDied","Data":"0b84081723ca1453c8aa4d244a5f5ec331449b973e67dc60baf7822bc3f8113e"} Apr 25 00:33:55.353470 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.353449 2576 generic.go:358] "Generic (PLEG): container finished" podID="d8e9a252-6a4d-4c75-8baf-2f53659367e5" containerID="dc765ae9965853b12c013db9ffbe7a70e6c1a70224ed96e012be60527ffabb61" exitCode=0 Apr 25 00:33:55.353558 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.353502 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb" event={"ID":"d8e9a252-6a4d-4c75-8baf-2f53659367e5","Type":"ContainerDied","Data":"dc765ae9965853b12c013db9ffbe7a70e6c1a70224ed96e012be60527ffabb61"} Apr 25 00:33:55.353558 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.353515 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb" Apr 25 00:33:55.353558 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.353526 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb" event={"ID":"d8e9a252-6a4d-4c75-8baf-2f53659367e5","Type":"ContainerDied","Data":"ea2708bd072a61d80c9295d504a76a02426a3c65f4f3a78ed147467242e316d1"} Apr 25 00:33:55.353558 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.353540 2576 scope.go:117] "RemoveContainer" containerID="a1f28c8a508e1408c76d250277c5ef58c93146b359df130bd085e70eac983c36" Apr 25 00:33:55.361444 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.361428 2576 scope.go:117] "RemoveContainer" containerID="dc765ae9965853b12c013db9ffbe7a70e6c1a70224ed96e012be60527ffabb61" Apr 25 00:33:55.368612 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.368593 2576 scope.go:117] "RemoveContainer" containerID="7e3c08828aedb88e59361bad9780c2fc74f35dda49e95f9de8e0955d6384dde9" Apr 25 00:33:55.378969 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.378952 2576 scope.go:117] "RemoveContainer" containerID="a1f28c8a508e1408c76d250277c5ef58c93146b359df130bd085e70eac983c36" Apr 25 00:33:55.379263 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:33:55.379240 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1f28c8a508e1408c76d250277c5ef58c93146b359df130bd085e70eac983c36\": container with ID starting with a1f28c8a508e1408c76d250277c5ef58c93146b359df130bd085e70eac983c36 not found: ID does not exist" containerID="a1f28c8a508e1408c76d250277c5ef58c93146b359df130bd085e70eac983c36" Apr 25 00:33:55.379327 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.379275 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1f28c8a508e1408c76d250277c5ef58c93146b359df130bd085e70eac983c36"} 
err="failed to get container status \"a1f28c8a508e1408c76d250277c5ef58c93146b359df130bd085e70eac983c36\": rpc error: code = NotFound desc = could not find container \"a1f28c8a508e1408c76d250277c5ef58c93146b359df130bd085e70eac983c36\": container with ID starting with a1f28c8a508e1408c76d250277c5ef58c93146b359df130bd085e70eac983c36 not found: ID does not exist" Apr 25 00:33:55.379327 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.379303 2576 scope.go:117] "RemoveContainer" containerID="dc765ae9965853b12c013db9ffbe7a70e6c1a70224ed96e012be60527ffabb61" Apr 25 00:33:55.379537 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:33:55.379519 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc765ae9965853b12c013db9ffbe7a70e6c1a70224ed96e012be60527ffabb61\": container with ID starting with dc765ae9965853b12c013db9ffbe7a70e6c1a70224ed96e012be60527ffabb61 not found: ID does not exist" containerID="dc765ae9965853b12c013db9ffbe7a70e6c1a70224ed96e012be60527ffabb61" Apr 25 00:33:55.379581 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.379544 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc765ae9965853b12c013db9ffbe7a70e6c1a70224ed96e012be60527ffabb61"} err="failed to get container status \"dc765ae9965853b12c013db9ffbe7a70e6c1a70224ed96e012be60527ffabb61\": rpc error: code = NotFound desc = could not find container \"dc765ae9965853b12c013db9ffbe7a70e6c1a70224ed96e012be60527ffabb61\": container with ID starting with dc765ae9965853b12c013db9ffbe7a70e6c1a70224ed96e012be60527ffabb61 not found: ID does not exist" Apr 25 00:33:55.379581 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.379560 2576 scope.go:117] "RemoveContainer" containerID="7e3c08828aedb88e59361bad9780c2fc74f35dda49e95f9de8e0955d6384dde9" Apr 25 00:33:55.379756 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:33:55.379742 2576 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"7e3c08828aedb88e59361bad9780c2fc74f35dda49e95f9de8e0955d6384dde9\": container with ID starting with 7e3c08828aedb88e59361bad9780c2fc74f35dda49e95f9de8e0955d6384dde9 not found: ID does not exist" containerID="7e3c08828aedb88e59361bad9780c2fc74f35dda49e95f9de8e0955d6384dde9" Apr 25 00:33:55.379797 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.379760 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e3c08828aedb88e59361bad9780c2fc74f35dda49e95f9de8e0955d6384dde9"} err="failed to get container status \"7e3c08828aedb88e59361bad9780c2fc74f35dda49e95f9de8e0955d6384dde9\": rpc error: code = NotFound desc = could not find container \"7e3c08828aedb88e59361bad9780c2fc74f35dda49e95f9de8e0955d6384dde9\": container with ID starting with 7e3c08828aedb88e59361bad9780c2fc74f35dda49e95f9de8e0955d6384dde9 not found: ID does not exist" Apr 25 00:33:55.401508 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.401480 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8e9a252-6a4d-4c75-8baf-2f53659367e5-kserve-provision-location\") pod \"d8e9a252-6a4d-4c75-8baf-2f53659367e5\" (UID: \"d8e9a252-6a4d-4c75-8baf-2f53659367e5\") " Apr 25 00:33:55.401590 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.401565 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnj6f\" (UniqueName: \"kubernetes.io/projected/d8e9a252-6a4d-4c75-8baf-2f53659367e5-kube-api-access-bnj6f\") pod \"d8e9a252-6a4d-4c75-8baf-2f53659367e5\" (UID: \"d8e9a252-6a4d-4c75-8baf-2f53659367e5\") " Apr 25 00:33:55.401633 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.401598 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8e9a252-6a4d-4c75-8baf-2f53659367e5-proxy-tls\") 
pod \"d8e9a252-6a4d-4c75-8baf-2f53659367e5\" (UID: \"d8e9a252-6a4d-4c75-8baf-2f53659367e5\") " Apr 25 00:33:55.401682 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.401640 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d8e9a252-6a4d-4c75-8baf-2f53659367e5-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"d8e9a252-6a4d-4c75-8baf-2f53659367e5\" (UID: \"d8e9a252-6a4d-4c75-8baf-2f53659367e5\") " Apr 25 00:33:55.401854 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.401790 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8e9a252-6a4d-4c75-8baf-2f53659367e5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d8e9a252-6a4d-4c75-8baf-2f53659367e5" (UID: "d8e9a252-6a4d-4c75-8baf-2f53659367e5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:33:55.401992 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.401969 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8e9a252-6a4d-4c75-8baf-2f53659367e5-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config") pod "d8e9a252-6a4d-4c75-8baf-2f53659367e5" (UID: "d8e9a252-6a4d-4c75-8baf-2f53659367e5"). InnerVolumeSpecName "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:33:55.403602 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.403581 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8e9a252-6a4d-4c75-8baf-2f53659367e5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d8e9a252-6a4d-4c75-8baf-2f53659367e5" (UID: "d8e9a252-6a4d-4c75-8baf-2f53659367e5"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:33:55.403701 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.403682 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8e9a252-6a4d-4c75-8baf-2f53659367e5-kube-api-access-bnj6f" (OuterVolumeSpecName: "kube-api-access-bnj6f") pod "d8e9a252-6a4d-4c75-8baf-2f53659367e5" (UID: "d8e9a252-6a4d-4c75-8baf-2f53659367e5"). InnerVolumeSpecName "kube-api-access-bnj6f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:33:55.502531 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.502494 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8e9a252-6a4d-4c75-8baf-2f53659367e5-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:33:55.502531 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.502522 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bnj6f\" (UniqueName: \"kubernetes.io/projected/d8e9a252-6a4d-4c75-8baf-2f53659367e5-kube-api-access-bnj6f\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:33:55.502531 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.502532 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d8e9a252-6a4d-4c75-8baf-2f53659367e5-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:33:55.502787 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.502543 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d8e9a252-6a4d-4c75-8baf-2f53659367e5-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:33:55.674804 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.674773 
2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb"] Apr 25 00:33:55.678931 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:55.678893 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-kncwb"] Apr 25 00:33:56.317625 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:56.317595 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8e9a252-6a4d-4c75-8baf-2f53659367e5" path="/var/lib/kubelet/pods/d8e9a252-6a4d-4c75-8baf-2f53659367e5/volumes" Apr 25 00:33:56.358782 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:56.358710 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" event={"ID":"c46480ac-83fd-413d-9aa8-38c8b6ee2c78","Type":"ContainerStarted","Data":"c1cddb94f7f694b31b08fb807a58b2a0bf5a9870af3566724788f03b1576f3e3"} Apr 25 00:33:56.358782 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:56.358739 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" event={"ID":"c46480ac-83fd-413d-9aa8-38c8b6ee2c78","Type":"ContainerStarted","Data":"406fa5e2a5e96dd883233bc34fd22bc5bb3faf3a65cb38b0f62da8bfb20566b0"} Apr 25 00:33:56.358998 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:56.358953 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" Apr 25 00:33:56.358998 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:56.358975 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" Apr 25 00:33:56.379781 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:33:56.379731 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" podStartSLOduration=6.379718136 podStartE2EDuration="6.379718136s" podCreationTimestamp="2026-04-25 00:33:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:33:56.377500087 +0000 UTC m=+2396.658617006" watchObservedRunningTime="2026-04-25 00:33:56.379718136 +0000 UTC m=+2396.660835056" Apr 25 00:34:02.367500 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:34:02.367473 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" Apr 25 00:34:32.368982 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:34:32.368940 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" podUID="c46480ac-83fd-413d-9aa8-38c8b6ee2c78" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.45:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.45:8080: connect: connection refused" Apr 25 00:34:37.111308 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:34:37.111277 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 25 00:34:37.115016 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:34:37.114992 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 25 00:34:42.368763 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:34:42.368725 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" podUID="c46480ac-83fd-413d-9aa8-38c8b6ee2c78" containerName="kserve-container" probeResult="failure" output="Get 
\"http://10.134.0.45:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.45:8080: connect: connection refused" Apr 25 00:34:52.368119 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:34:52.368082 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" podUID="c46480ac-83fd-413d-9aa8-38c8b6ee2c78" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.45:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.45:8080: connect: connection refused" Apr 25 00:35:02.368725 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:02.368684 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" podUID="c46480ac-83fd-413d-9aa8-38c8b6ee2c78" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.45:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.45:8080: connect: connection refused" Apr 25 00:35:10.316994 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:10.316960 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" Apr 25 00:35:10.953032 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:10.953001 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s"] Apr 25 00:35:11.031981 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.031951 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk"] Apr 25 00:35:11.032222 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.032211 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d8e9a252-6a4d-4c75-8baf-2f53659367e5" containerName="storage-initializer" Apr 25 00:35:11.032272 ip-10-0-129-109 
kubenswrapper[2576]: I0425 00:35:11.032224 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e9a252-6a4d-4c75-8baf-2f53659367e5" containerName="storage-initializer" Apr 25 00:35:11.032272 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.032236 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d8e9a252-6a4d-4c75-8baf-2f53659367e5" containerName="kube-rbac-proxy" Apr 25 00:35:11.032272 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.032241 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e9a252-6a4d-4c75-8baf-2f53659367e5" containerName="kube-rbac-proxy" Apr 25 00:35:11.032272 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.032256 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d8e9a252-6a4d-4c75-8baf-2f53659367e5" containerName="kserve-container" Apr 25 00:35:11.032272 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.032262 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e9a252-6a4d-4c75-8baf-2f53659367e5" containerName="kserve-container" Apr 25 00:35:11.032451 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.032306 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d8e9a252-6a4d-4c75-8baf-2f53659367e5" containerName="kserve-container" Apr 25 00:35:11.032451 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.032316 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d8e9a252-6a4d-4c75-8baf-2f53659367e5" containerName="kube-rbac-proxy" Apr 25 00:35:11.035173 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.035158 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" Apr 25 00:35:11.038331 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.038310 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-predictor-serving-cert\"" Apr 25 00:35:11.038739 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.038720 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\"" Apr 25 00:35:11.060990 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.060960 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk"] Apr 25 00:35:11.138146 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.138113 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss2sr\" (UniqueName: \"kubernetes.io/projected/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-kube-api-access-ss2sr\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk\" (UID: \"04fb9f8d-f6ca-42e4-9682-1895e9a0a871\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" Apr 25 00:35:11.138299 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.138157 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk\" (UID: \"04fb9f8d-f6ca-42e4-9682-1895e9a0a871\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" Apr 25 00:35:11.138299 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.138177 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk\" (UID: \"04fb9f8d-f6ca-42e4-9682-1895e9a0a871\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" Apr 25 00:35:11.138299 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.138236 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk\" (UID: \"04fb9f8d-f6ca-42e4-9682-1895e9a0a871\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" Apr 25 00:35:11.238984 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.238896 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk\" (UID: \"04fb9f8d-f6ca-42e4-9682-1895e9a0a871\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" Apr 25 00:35:11.238984 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.238956 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk\" (UID: \"04fb9f8d-f6ca-42e4-9682-1895e9a0a871\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" Apr 25 00:35:11.239173 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.238998 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ss2sr\" (UniqueName: \"kubernetes.io/projected/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-kube-api-access-ss2sr\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk\" (UID: \"04fb9f8d-f6ca-42e4-9682-1895e9a0a871\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" Apr 25 00:35:11.239173 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.239023 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk\" (UID: \"04fb9f8d-f6ca-42e4-9682-1895e9a0a871\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" Apr 25 00:35:11.239173 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:35:11.239121 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-serving-cert: secret "isvc-predictive-lightgbm-v2-predictor-serving-cert" not found Apr 25 00:35:11.239276 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:35:11.239184 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-proxy-tls podName:04fb9f8d-f6ca-42e4-9682-1895e9a0a871 nodeName:}" failed. No retries permitted until 2026-04-25 00:35:11.739169095 +0000 UTC m=+2472.020285999 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-proxy-tls") pod "isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" (UID: "04fb9f8d-f6ca-42e4-9682-1895e9a0a871") : secret "isvc-predictive-lightgbm-v2-predictor-serving-cert" not found Apr 25 00:35:11.239340 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.239321 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk\" (UID: \"04fb9f8d-f6ca-42e4-9682-1895e9a0a871\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" Apr 25 00:35:11.239640 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.239622 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk\" (UID: \"04fb9f8d-f6ca-42e4-9682-1895e9a0a871\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" Apr 25 00:35:11.247701 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.247678 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss2sr\" (UniqueName: \"kubernetes.io/projected/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-kube-api-access-ss2sr\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk\" (UID: \"04fb9f8d-f6ca-42e4-9682-1895e9a0a871\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" Apr 25 00:35:11.566071 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.566003 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" podUID="c46480ac-83fd-413d-9aa8-38c8b6ee2c78" containerName="kserve-container" containerID="cri-o://406fa5e2a5e96dd883233bc34fd22bc5bb3faf3a65cb38b0f62da8bfb20566b0" gracePeriod=30 Apr 25 00:35:11.566552 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.566055 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" podUID="c46480ac-83fd-413d-9aa8-38c8b6ee2c78" containerName="kube-rbac-proxy" containerID="cri-o://c1cddb94f7f694b31b08fb807a58b2a0bf5a9870af3566724788f03b1576f3e3" gracePeriod=30 Apr 25 00:35:11.743224 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.743197 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk\" (UID: \"04fb9f8d-f6ca-42e4-9682-1895e9a0a871\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" Apr 25 00:35:11.745497 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.745472 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk\" (UID: \"04fb9f8d-f6ca-42e4-9682-1895e9a0a871\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" Apr 25 00:35:11.945534 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:11.945465 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" Apr 25 00:35:12.062672 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:12.062558 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk"] Apr 25 00:35:12.064864 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:35:12.064832 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04fb9f8d_f6ca_42e4_9682_1895e9a0a871.slice/crio-898449f5d2b303b414dc39498d16d9e7700bc3521e2d9163d6f7718eeabb20bc WatchSource:0}: Error finding container 898449f5d2b303b414dc39498d16d9e7700bc3521e2d9163d6f7718eeabb20bc: Status 404 returned error can't find the container with id 898449f5d2b303b414dc39498d16d9e7700bc3521e2d9163d6f7718eeabb20bc Apr 25 00:35:12.362811 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:12.362772 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" podUID="c46480ac-83fd-413d-9aa8-38c8b6ee2c78" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.45:8643/healthz\": dial tcp 10.134.0.45:8643: connect: connection refused" Apr 25 00:35:12.571305 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:12.571273 2576 generic.go:358] "Generic (PLEG): container finished" podID="c46480ac-83fd-413d-9aa8-38c8b6ee2c78" containerID="c1cddb94f7f694b31b08fb807a58b2a0bf5a9870af3566724788f03b1576f3e3" exitCode=2 Apr 25 00:35:12.571723 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:12.571346 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" event={"ID":"c46480ac-83fd-413d-9aa8-38c8b6ee2c78","Type":"ContainerDied","Data":"c1cddb94f7f694b31b08fb807a58b2a0bf5a9870af3566724788f03b1576f3e3"} Apr 25 00:35:12.572641 ip-10-0-129-109 
kubenswrapper[2576]: I0425 00:35:12.572610 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" event={"ID":"04fb9f8d-f6ca-42e4-9682-1895e9a0a871","Type":"ContainerStarted","Data":"e92e18625505f810e329f05451894d322db8a872bf1f8cf0eaf4df15ee2eceed"} Apr 25 00:35:12.572751 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:12.572649 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" event={"ID":"04fb9f8d-f6ca-42e4-9682-1895e9a0a871","Type":"ContainerStarted","Data":"898449f5d2b303b414dc39498d16d9e7700bc3521e2d9163d6f7718eeabb20bc"} Apr 25 00:35:16.002534 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.002512 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" Apr 25 00:35:16.074040 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.074000 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-kserve-provision-location\") pod \"c46480ac-83fd-413d-9aa8-38c8b6ee2c78\" (UID: \"c46480ac-83fd-413d-9aa8-38c8b6ee2c78\") " Apr 25 00:35:16.074177 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.074058 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-proxy-tls\") pod \"c46480ac-83fd-413d-9aa8-38c8b6ee2c78\" (UID: \"c46480ac-83fd-413d-9aa8-38c8b6ee2c78\") " Apr 25 00:35:16.074177 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.074111 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv8fz\" (UniqueName: \"kubernetes.io/projected/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-kube-api-access-cv8fz\") pod 
\"c46480ac-83fd-413d-9aa8-38c8b6ee2c78\" (UID: \"c46480ac-83fd-413d-9aa8-38c8b6ee2c78\") " Apr 25 00:35:16.074177 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.074153 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"c46480ac-83fd-413d-9aa8-38c8b6ee2c78\" (UID: \"c46480ac-83fd-413d-9aa8-38c8b6ee2c78\") " Apr 25 00:35:16.074381 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.074350 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c46480ac-83fd-413d-9aa8-38c8b6ee2c78" (UID: "c46480ac-83fd-413d-9aa8-38c8b6ee2c78"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:35:16.074579 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.074555 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config") pod "c46480ac-83fd-413d-9aa8-38c8b6ee2c78" (UID: "c46480ac-83fd-413d-9aa8-38c8b6ee2c78"). InnerVolumeSpecName "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:35:16.076237 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.076219 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-kube-api-access-cv8fz" (OuterVolumeSpecName: "kube-api-access-cv8fz") pod "c46480ac-83fd-413d-9aa8-38c8b6ee2c78" (UID: "c46480ac-83fd-413d-9aa8-38c8b6ee2c78"). 
InnerVolumeSpecName "kube-api-access-cv8fz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:35:16.076302 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.076241 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c46480ac-83fd-413d-9aa8-38c8b6ee2c78" (UID: "c46480ac-83fd-413d-9aa8-38c8b6ee2c78"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:35:16.175231 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.175163 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:35:16.175231 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.175188 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:35:16.175231 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.175198 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cv8fz\" (UniqueName: \"kubernetes.io/projected/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-kube-api-access-cv8fz\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:35:16.175231 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.175208 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c46480ac-83fd-413d-9aa8-38c8b6ee2c78-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:35:16.584954 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.584898 2576 generic.go:358] 
"Generic (PLEG): container finished" podID="c46480ac-83fd-413d-9aa8-38c8b6ee2c78" containerID="406fa5e2a5e96dd883233bc34fd22bc5bb3faf3a65cb38b0f62da8bfb20566b0" exitCode=0 Apr 25 00:35:16.585119 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.584991 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" event={"ID":"c46480ac-83fd-413d-9aa8-38c8b6ee2c78","Type":"ContainerDied","Data":"406fa5e2a5e96dd883233bc34fd22bc5bb3faf3a65cb38b0f62da8bfb20566b0"} Apr 25 00:35:16.585119 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.585044 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" event={"ID":"c46480ac-83fd-413d-9aa8-38c8b6ee2c78","Type":"ContainerDied","Data":"c53215fb235a4bef3efdda20f3df784bfa5fdbf8e6f5de463919ec9a181f9d14"} Apr 25 00:35:16.585119 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.585004 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s" Apr 25 00:35:16.585119 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.585068 2576 scope.go:117] "RemoveContainer" containerID="c1cddb94f7f694b31b08fb807a58b2a0bf5a9870af3566724788f03b1576f3e3" Apr 25 00:35:16.586442 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.586419 2576 generic.go:358] "Generic (PLEG): container finished" podID="04fb9f8d-f6ca-42e4-9682-1895e9a0a871" containerID="e92e18625505f810e329f05451894d322db8a872bf1f8cf0eaf4df15ee2eceed" exitCode=0 Apr 25 00:35:16.586559 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.586461 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" event={"ID":"04fb9f8d-f6ca-42e4-9682-1895e9a0a871","Type":"ContainerDied","Data":"e92e18625505f810e329f05451894d322db8a872bf1f8cf0eaf4df15ee2eceed"} Apr 25 00:35:16.593131 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.593118 2576 scope.go:117] "RemoveContainer" containerID="406fa5e2a5e96dd883233bc34fd22bc5bb3faf3a65cb38b0f62da8bfb20566b0" Apr 25 00:35:16.600143 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.600113 2576 scope.go:117] "RemoveContainer" containerID="0b84081723ca1453c8aa4d244a5f5ec331449b973e67dc60baf7822bc3f8113e" Apr 25 00:35:16.613618 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.612677 2576 scope.go:117] "RemoveContainer" containerID="c1cddb94f7f694b31b08fb807a58b2a0bf5a9870af3566724788f03b1576f3e3" Apr 25 00:35:16.613618 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:35:16.613241 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1cddb94f7f694b31b08fb807a58b2a0bf5a9870af3566724788f03b1576f3e3\": container with ID starting with c1cddb94f7f694b31b08fb807a58b2a0bf5a9870af3566724788f03b1576f3e3 not found: ID does not exist" 
containerID="c1cddb94f7f694b31b08fb807a58b2a0bf5a9870af3566724788f03b1576f3e3" Apr 25 00:35:16.613618 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.613280 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1cddb94f7f694b31b08fb807a58b2a0bf5a9870af3566724788f03b1576f3e3"} err="failed to get container status \"c1cddb94f7f694b31b08fb807a58b2a0bf5a9870af3566724788f03b1576f3e3\": rpc error: code = NotFound desc = could not find container \"c1cddb94f7f694b31b08fb807a58b2a0bf5a9870af3566724788f03b1576f3e3\": container with ID starting with c1cddb94f7f694b31b08fb807a58b2a0bf5a9870af3566724788f03b1576f3e3 not found: ID does not exist" Apr 25 00:35:16.613618 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.613366 2576 scope.go:117] "RemoveContainer" containerID="406fa5e2a5e96dd883233bc34fd22bc5bb3faf3a65cb38b0f62da8bfb20566b0" Apr 25 00:35:16.614174 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:35:16.614152 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"406fa5e2a5e96dd883233bc34fd22bc5bb3faf3a65cb38b0f62da8bfb20566b0\": container with ID starting with 406fa5e2a5e96dd883233bc34fd22bc5bb3faf3a65cb38b0f62da8bfb20566b0 not found: ID does not exist" containerID="406fa5e2a5e96dd883233bc34fd22bc5bb3faf3a65cb38b0f62da8bfb20566b0" Apr 25 00:35:16.614287 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.614178 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"406fa5e2a5e96dd883233bc34fd22bc5bb3faf3a65cb38b0f62da8bfb20566b0"} err="failed to get container status \"406fa5e2a5e96dd883233bc34fd22bc5bb3faf3a65cb38b0f62da8bfb20566b0\": rpc error: code = NotFound desc = could not find container \"406fa5e2a5e96dd883233bc34fd22bc5bb3faf3a65cb38b0f62da8bfb20566b0\": container with ID starting with 406fa5e2a5e96dd883233bc34fd22bc5bb3faf3a65cb38b0f62da8bfb20566b0 not found: ID does not exist" Apr 25 
00:35:16.614287 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.614193 2576 scope.go:117] "RemoveContainer" containerID="0b84081723ca1453c8aa4d244a5f5ec331449b973e67dc60baf7822bc3f8113e" Apr 25 00:35:16.615066 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:35:16.615027 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b84081723ca1453c8aa4d244a5f5ec331449b973e67dc60baf7822bc3f8113e\": container with ID starting with 0b84081723ca1453c8aa4d244a5f5ec331449b973e67dc60baf7822bc3f8113e not found: ID does not exist" containerID="0b84081723ca1453c8aa4d244a5f5ec331449b973e67dc60baf7822bc3f8113e" Apr 25 00:35:16.615213 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.615191 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b84081723ca1453c8aa4d244a5f5ec331449b973e67dc60baf7822bc3f8113e"} err="failed to get container status \"0b84081723ca1453c8aa4d244a5f5ec331449b973e67dc60baf7822bc3f8113e\": rpc error: code = NotFound desc = could not find container \"0b84081723ca1453c8aa4d244a5f5ec331449b973e67dc60baf7822bc3f8113e\": container with ID starting with 0b84081723ca1453c8aa4d244a5f5ec331449b973e67dc60baf7822bc3f8113e not found: ID does not exist" Apr 25 00:35:16.624403 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.624383 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s"] Apr 25 00:35:16.627959 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:16.627938 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-9dn4s"] Apr 25 00:35:17.591792 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:17.591758 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" 
event={"ID":"04fb9f8d-f6ca-42e4-9682-1895e9a0a871","Type":"ContainerStarted","Data":"3e71bc629d19c0d369087fe8f75093062f8c99029bd85881270ad9937433ca58"} Apr 25 00:35:17.591792 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:17.591791 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" event={"ID":"04fb9f8d-f6ca-42e4-9682-1895e9a0a871","Type":"ContainerStarted","Data":"66723b441fbdb8a35f63b895d53ed045557bb091bcff23edb171276bc6fb16d8"} Apr 25 00:35:17.592252 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:17.592037 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" Apr 25 00:35:17.592252 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:17.592099 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" Apr 25 00:35:17.610904 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:17.610860 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" podStartSLOduration=6.610848503 podStartE2EDuration="6.610848503s" podCreationTimestamp="2026-04-25 00:35:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:35:17.60949526 +0000 UTC m=+2477.890612180" watchObservedRunningTime="2026-04-25 00:35:17.610848503 +0000 UTC m=+2477.891965423" Apr 25 00:35:18.315622 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:18.315583 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c46480ac-83fd-413d-9aa8-38c8b6ee2c78" path="/var/lib/kubelet/pods/c46480ac-83fd-413d-9aa8-38c8b6ee2c78/volumes" Apr 25 00:35:23.600513 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:23.600482 2576 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" Apr 25 00:35:53.601588 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:35:53.601546 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" podUID="04fb9f8d-f6ca-42e4-9682-1895e9a0a871" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.46:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.46:8080: connect: connection refused" Apr 25 00:36:03.601937 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:03.601874 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" podUID="04fb9f8d-f6ca-42e4-9682-1895e9a0a871" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.46:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.46:8080: connect: connection refused" Apr 25 00:36:13.601288 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:13.601239 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" podUID="04fb9f8d-f6ca-42e4-9682-1895e9a0a871" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.46:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.46:8080: connect: connection refused" Apr 25 00:36:23.602080 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:23.602034 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" podUID="04fb9f8d-f6ca-42e4-9682-1895e9a0a871" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.46:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.46:8080: connect: connection refused" Apr 25 00:36:29.315147 
ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:29.315106 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" Apr 25 00:36:31.091281 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:31.091217 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk"] Apr 25 00:36:31.092271 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:31.092216 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" podUID="04fb9f8d-f6ca-42e4-9682-1895e9a0a871" containerName="kube-rbac-proxy" containerID="cri-o://3e71bc629d19c0d369087fe8f75093062f8c99029bd85881270ad9937433ca58" gracePeriod=30 Apr 25 00:36:31.092537 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:31.092196 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" podUID="04fb9f8d-f6ca-42e4-9682-1895e9a0a871" containerName="kserve-container" containerID="cri-o://66723b441fbdb8a35f63b895d53ed045557bb091bcff23edb171276bc6fb16d8" gracePeriod=30 Apr 25 00:36:31.793276 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:31.793242 2576 generic.go:358] "Generic (PLEG): container finished" podID="04fb9f8d-f6ca-42e4-9682-1895e9a0a871" containerID="3e71bc629d19c0d369087fe8f75093062f8c99029bd85881270ad9937433ca58" exitCode=2 Apr 25 00:36:31.793619 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:31.793321 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" event={"ID":"04fb9f8d-f6ca-42e4-9682-1895e9a0a871","Type":"ContainerDied","Data":"3e71bc629d19c0d369087fe8f75093062f8c99029bd85881270ad9937433ca58"} Apr 25 00:36:33.260359 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.260325 2576 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d"] Apr 25 00:36:33.260768 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.260607 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c46480ac-83fd-413d-9aa8-38c8b6ee2c78" containerName="storage-initializer" Apr 25 00:36:33.260768 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.260617 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46480ac-83fd-413d-9aa8-38c8b6ee2c78" containerName="storage-initializer" Apr 25 00:36:33.260768 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.260634 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c46480ac-83fd-413d-9aa8-38c8b6ee2c78" containerName="kserve-container" Apr 25 00:36:33.260768 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.260639 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46480ac-83fd-413d-9aa8-38c8b6ee2c78" containerName="kserve-container" Apr 25 00:36:33.260768 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.260649 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c46480ac-83fd-413d-9aa8-38c8b6ee2c78" containerName="kube-rbac-proxy" Apr 25 00:36:33.260768 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.260656 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46480ac-83fd-413d-9aa8-38c8b6ee2c78" containerName="kube-rbac-proxy" Apr 25 00:36:33.260768 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.260697 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c46480ac-83fd-413d-9aa8-38c8b6ee2c78" containerName="kube-rbac-proxy" Apr 25 00:36:33.260768 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.260705 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c46480ac-83fd-413d-9aa8-38c8b6ee2c78" containerName="kserve-container" Apr 25 00:36:33.263574 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.263553 
2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" Apr 25 00:36:33.265873 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.265854 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-predictor-serving-cert\"" Apr 25 00:36:33.265946 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.265877 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-kube-rbac-proxy-sar-config\"" Apr 25 00:36:33.272358 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.272336 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d"] Apr 25 00:36:33.289580 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.289550 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/78f608da-2859-43f9-9a66-4d094bb5ee48-proxy-tls\") pod \"isvc-sklearn-predictor-66877cc755-gpc2d\" (UID: \"78f608da-2859-43f9-9a66-4d094bb5ee48\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" Apr 25 00:36:33.289689 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.289591 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r78jp\" (UniqueName: \"kubernetes.io/projected/78f608da-2859-43f9-9a66-4d094bb5ee48-kube-api-access-r78jp\") pod \"isvc-sklearn-predictor-66877cc755-gpc2d\" (UID: \"78f608da-2859-43f9-9a66-4d094bb5ee48\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" Apr 25 00:36:33.289689 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.289651 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/78f608da-2859-43f9-9a66-4d094bb5ee48-kserve-provision-location\") pod \"isvc-sklearn-predictor-66877cc755-gpc2d\" (UID: \"78f608da-2859-43f9-9a66-4d094bb5ee48\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" Apr 25 00:36:33.289782 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.289717 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/78f608da-2859-43f9-9a66-4d094bb5ee48-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-66877cc755-gpc2d\" (UID: \"78f608da-2859-43f9-9a66-4d094bb5ee48\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" Apr 25 00:36:33.391073 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.391037 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/78f608da-2859-43f9-9a66-4d094bb5ee48-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-66877cc755-gpc2d\" (UID: \"78f608da-2859-43f9-9a66-4d094bb5ee48\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" Apr 25 00:36:33.391246 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.391100 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/78f608da-2859-43f9-9a66-4d094bb5ee48-proxy-tls\") pod \"isvc-sklearn-predictor-66877cc755-gpc2d\" (UID: \"78f608da-2859-43f9-9a66-4d094bb5ee48\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" Apr 25 00:36:33.391246 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.391133 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r78jp\" (UniqueName: \"kubernetes.io/projected/78f608da-2859-43f9-9a66-4d094bb5ee48-kube-api-access-r78jp\") pod 
\"isvc-sklearn-predictor-66877cc755-gpc2d\" (UID: \"78f608da-2859-43f9-9a66-4d094bb5ee48\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" Apr 25 00:36:33.391246 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.391161 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78f608da-2859-43f9-9a66-4d094bb5ee48-kserve-provision-location\") pod \"isvc-sklearn-predictor-66877cc755-gpc2d\" (UID: \"78f608da-2859-43f9-9a66-4d094bb5ee48\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" Apr 25 00:36:33.391429 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:36:33.391256 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-predictor-serving-cert: secret "isvc-sklearn-predictor-serving-cert" not found Apr 25 00:36:33.391429 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:36:33.391346 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78f608da-2859-43f9-9a66-4d094bb5ee48-proxy-tls podName:78f608da-2859-43f9-9a66-4d094bb5ee48 nodeName:}" failed. No retries permitted until 2026-04-25 00:36:33.891321416 +0000 UTC m=+2554.172438319 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/78f608da-2859-43f9-9a66-4d094bb5ee48-proxy-tls") pod "isvc-sklearn-predictor-66877cc755-gpc2d" (UID: "78f608da-2859-43f9-9a66-4d094bb5ee48") : secret "isvc-sklearn-predictor-serving-cert" not found Apr 25 00:36:33.391610 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.391586 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78f608da-2859-43f9-9a66-4d094bb5ee48-kserve-provision-location\") pod \"isvc-sklearn-predictor-66877cc755-gpc2d\" (UID: \"78f608da-2859-43f9-9a66-4d094bb5ee48\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" Apr 25 00:36:33.391759 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.391744 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/78f608da-2859-43f9-9a66-4d094bb5ee48-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-66877cc755-gpc2d\" (UID: \"78f608da-2859-43f9-9a66-4d094bb5ee48\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" Apr 25 00:36:33.400147 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.400123 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r78jp\" (UniqueName: \"kubernetes.io/projected/78f608da-2859-43f9-9a66-4d094bb5ee48-kube-api-access-r78jp\") pod \"isvc-sklearn-predictor-66877cc755-gpc2d\" (UID: \"78f608da-2859-43f9-9a66-4d094bb5ee48\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" Apr 25 00:36:33.596383 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.596336 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" podUID="04fb9f8d-f6ca-42e4-9682-1895e9a0a871" containerName="kube-rbac-proxy" probeResult="failure" output="Get 
\"https://10.134.0.46:8643/healthz\": dial tcp 10.134.0.46:8643: connect: connection refused" Apr 25 00:36:33.895495 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.895421 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/78f608da-2859-43f9-9a66-4d094bb5ee48-proxy-tls\") pod \"isvc-sklearn-predictor-66877cc755-gpc2d\" (UID: \"78f608da-2859-43f9-9a66-4d094bb5ee48\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" Apr 25 00:36:33.898002 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:33.897977 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/78f608da-2859-43f9-9a66-4d094bb5ee48-proxy-tls\") pod \"isvc-sklearn-predictor-66877cc755-gpc2d\" (UID: \"78f608da-2859-43f9-9a66-4d094bb5ee48\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" Apr 25 00:36:34.174941 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:34.174836 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" Apr 25 00:36:34.292488 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:34.292437 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d"] Apr 25 00:36:34.295634 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:36:34.295604 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78f608da_2859_43f9_9a66_4d094bb5ee48.slice/crio-5bf960d901db691fc4296b2e1d4fd471b523ee7c19a8e4d992253abd557b1447 WatchSource:0}: Error finding container 5bf960d901db691fc4296b2e1d4fd471b523ee7c19a8e4d992253abd557b1447: Status 404 returned error can't find the container with id 5bf960d901db691fc4296b2e1d4fd471b523ee7c19a8e4d992253abd557b1447 Apr 25 00:36:34.802809 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:34.802764 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" event={"ID":"78f608da-2859-43f9-9a66-4d094bb5ee48","Type":"ContainerStarted","Data":"4649cdf09159352919ddc4414471794d5a0c502eb8465465660744d1f84e4555"} Apr 25 00:36:34.802973 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:34.802819 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" event={"ID":"78f608da-2859-43f9-9a66-4d094bb5ee48","Type":"ContainerStarted","Data":"5bf960d901db691fc4296b2e1d4fd471b523ee7c19a8e4d992253abd557b1447"} Apr 25 00:36:36.228694 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.228672 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" Apr 25 00:36:36.314499 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.314472 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"04fb9f8d-f6ca-42e4-9682-1895e9a0a871\" (UID: \"04fb9f8d-f6ca-42e4-9682-1895e9a0a871\") " Apr 25 00:36:36.314664 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.314505 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-proxy-tls\") pod \"04fb9f8d-f6ca-42e4-9682-1895e9a0a871\" (UID: \"04fb9f8d-f6ca-42e4-9682-1895e9a0a871\") " Apr 25 00:36:36.314664 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.314558 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-kserve-provision-location\") pod \"04fb9f8d-f6ca-42e4-9682-1895e9a0a871\" (UID: \"04fb9f8d-f6ca-42e4-9682-1895e9a0a871\") " Apr 25 00:36:36.314664 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.314614 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss2sr\" (UniqueName: \"kubernetes.io/projected/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-kube-api-access-ss2sr\") pod \"04fb9f8d-f6ca-42e4-9682-1895e9a0a871\" (UID: \"04fb9f8d-f6ca-42e4-9682-1895e9a0a871\") " Apr 25 00:36:36.314885 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.314864 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config") pod "04fb9f8d-f6ca-42e4-9682-1895e9a0a871" (UID: "04fb9f8d-f6ca-42e4-9682-1895e9a0a871"). InnerVolumeSpecName "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:36:36.314987 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.314967 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "04fb9f8d-f6ca-42e4-9682-1895e9a0a871" (UID: "04fb9f8d-f6ca-42e4-9682-1895e9a0a871"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:36:36.316874 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.316843 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-kube-api-access-ss2sr" (OuterVolumeSpecName: "kube-api-access-ss2sr") pod "04fb9f8d-f6ca-42e4-9682-1895e9a0a871" (UID: "04fb9f8d-f6ca-42e4-9682-1895e9a0a871"). InnerVolumeSpecName "kube-api-access-ss2sr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:36:36.316990 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.316898 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "04fb9f8d-f6ca-42e4-9682-1895e9a0a871" (UID: "04fb9f8d-f6ca-42e4-9682-1895e9a0a871"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:36:36.416089 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.416018 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:36:36.416089 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.416045 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ss2sr\" (UniqueName: \"kubernetes.io/projected/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-kube-api-access-ss2sr\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:36:36.416089 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.416060 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:36:36.416089 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.416074 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04fb9f8d-f6ca-42e4-9682-1895e9a0a871-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:36:36.809369 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.809339 2576 generic.go:358] "Generic (PLEG): container finished" podID="04fb9f8d-f6ca-42e4-9682-1895e9a0a871" containerID="66723b441fbdb8a35f63b895d53ed045557bb091bcff23edb171276bc6fb16d8" exitCode=0 Apr 25 00:36:36.809537 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.809417 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" Apr 25 00:36:36.809537 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.809424 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" event={"ID":"04fb9f8d-f6ca-42e4-9682-1895e9a0a871","Type":"ContainerDied","Data":"66723b441fbdb8a35f63b895d53ed045557bb091bcff23edb171276bc6fb16d8"} Apr 25 00:36:36.809537 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.809469 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk" event={"ID":"04fb9f8d-f6ca-42e4-9682-1895e9a0a871","Type":"ContainerDied","Data":"898449f5d2b303b414dc39498d16d9e7700bc3521e2d9163d6f7718eeabb20bc"} Apr 25 00:36:36.809537 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.809490 2576 scope.go:117] "RemoveContainer" containerID="3e71bc629d19c0d369087fe8f75093062f8c99029bd85881270ad9937433ca58" Apr 25 00:36:36.817671 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.817622 2576 scope.go:117] "RemoveContainer" containerID="66723b441fbdb8a35f63b895d53ed045557bb091bcff23edb171276bc6fb16d8" Apr 25 00:36:36.824539 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.824522 2576 scope.go:117] "RemoveContainer" containerID="e92e18625505f810e329f05451894d322db8a872bf1f8cf0eaf4df15ee2eceed" Apr 25 00:36:36.830588 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.830563 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk"] Apr 25 00:36:36.833674 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.833655 2576 scope.go:117] "RemoveContainer" containerID="3e71bc629d19c0d369087fe8f75093062f8c99029bd85881270ad9937433ca58" Apr 25 00:36:36.834213 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:36:36.834189 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"3e71bc629d19c0d369087fe8f75093062f8c99029bd85881270ad9937433ca58\": container with ID starting with 3e71bc629d19c0d369087fe8f75093062f8c99029bd85881270ad9937433ca58 not found: ID does not exist" containerID="3e71bc629d19c0d369087fe8f75093062f8c99029bd85881270ad9937433ca58" Apr 25 00:36:36.834313 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.834222 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e71bc629d19c0d369087fe8f75093062f8c99029bd85881270ad9937433ca58"} err="failed to get container status \"3e71bc629d19c0d369087fe8f75093062f8c99029bd85881270ad9937433ca58\": rpc error: code = NotFound desc = could not find container \"3e71bc629d19c0d369087fe8f75093062f8c99029bd85881270ad9937433ca58\": container with ID starting with 3e71bc629d19c0d369087fe8f75093062f8c99029bd85881270ad9937433ca58 not found: ID does not exist" Apr 25 00:36:36.834313 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.834241 2576 scope.go:117] "RemoveContainer" containerID="66723b441fbdb8a35f63b895d53ed045557bb091bcff23edb171276bc6fb16d8" Apr 25 00:36:36.834474 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:36:36.834458 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66723b441fbdb8a35f63b895d53ed045557bb091bcff23edb171276bc6fb16d8\": container with ID starting with 66723b441fbdb8a35f63b895d53ed045557bb091bcff23edb171276bc6fb16d8 not found: ID does not exist" containerID="66723b441fbdb8a35f63b895d53ed045557bb091bcff23edb171276bc6fb16d8" Apr 25 00:36:36.834524 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.834478 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66723b441fbdb8a35f63b895d53ed045557bb091bcff23edb171276bc6fb16d8"} err="failed to get container status \"66723b441fbdb8a35f63b895d53ed045557bb091bcff23edb171276bc6fb16d8\": rpc error: code = 
NotFound desc = could not find container \"66723b441fbdb8a35f63b895d53ed045557bb091bcff23edb171276bc6fb16d8\": container with ID starting with 66723b441fbdb8a35f63b895d53ed045557bb091bcff23edb171276bc6fb16d8 not found: ID does not exist" Apr 25 00:36:36.834524 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.834490 2576 scope.go:117] "RemoveContainer" containerID="e92e18625505f810e329f05451894d322db8a872bf1f8cf0eaf4df15ee2eceed" Apr 25 00:36:36.834697 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:36:36.834680 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e92e18625505f810e329f05451894d322db8a872bf1f8cf0eaf4df15ee2eceed\": container with ID starting with e92e18625505f810e329f05451894d322db8a872bf1f8cf0eaf4df15ee2eceed not found: ID does not exist" containerID="e92e18625505f810e329f05451894d322db8a872bf1f8cf0eaf4df15ee2eceed" Apr 25 00:36:36.834738 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.834700 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e92e18625505f810e329f05451894d322db8a872bf1f8cf0eaf4df15ee2eceed"} err="failed to get container status \"e92e18625505f810e329f05451894d322db8a872bf1f8cf0eaf4df15ee2eceed\": rpc error: code = NotFound desc = could not find container \"e92e18625505f810e329f05451894d322db8a872bf1f8cf0eaf4df15ee2eceed\": container with ID starting with e92e18625505f810e329f05451894d322db8a872bf1f8cf0eaf4df15ee2eceed not found: ID does not exist" Apr 25 00:36:36.839537 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:36.839515 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-zpbtk"] Apr 25 00:36:38.315212 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:38.315186 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04fb9f8d-f6ca-42e4-9682-1895e9a0a871" 
path="/var/lib/kubelet/pods/04fb9f8d-f6ca-42e4-9682-1895e9a0a871/volumes" Apr 25 00:36:38.817144 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:38.817112 2576 generic.go:358] "Generic (PLEG): container finished" podID="78f608da-2859-43f9-9a66-4d094bb5ee48" containerID="4649cdf09159352919ddc4414471794d5a0c502eb8465465660744d1f84e4555" exitCode=0 Apr 25 00:36:38.817144 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:38.817155 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" event={"ID":"78f608da-2859-43f9-9a66-4d094bb5ee48","Type":"ContainerDied","Data":"4649cdf09159352919ddc4414471794d5a0c502eb8465465660744d1f84e4555"} Apr 25 00:36:39.822289 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:39.822258 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" event={"ID":"78f608da-2859-43f9-9a66-4d094bb5ee48","Type":"ContainerStarted","Data":"4ef45e06616e932466d174e49dba6ae2e13df27eaee46a924c87fda754186488"} Apr 25 00:36:39.822289 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:39.822295 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" event={"ID":"78f608da-2859-43f9-9a66-4d094bb5ee48","Type":"ContainerStarted","Data":"6caed4d709e9d61f084ef55ac34622da1b867053fe9072560de543aadb775aa0"} Apr 25 00:36:39.822749 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:39.822590 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" Apr 25 00:36:39.822749 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:39.822738 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" Apr 25 00:36:39.823869 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:39.823841 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" podUID="78f608da-2859-43f9-9a66-4d094bb5ee48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 25 00:36:39.840705 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:39.840668 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" podStartSLOduration=6.840655012 podStartE2EDuration="6.840655012s" podCreationTimestamp="2026-04-25 00:36:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:36:39.839147381 +0000 UTC m=+2560.120264300" watchObservedRunningTime="2026-04-25 00:36:39.840655012 +0000 UTC m=+2560.121771932" Apr 25 00:36:40.824936 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:40.824883 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" podUID="78f608da-2859-43f9-9a66-4d094bb5ee48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 25 00:36:45.829071 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:45.829044 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" Apr 25 00:36:45.829473 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:45.829443 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" podUID="78f608da-2859-43f9-9a66-4d094bb5ee48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 25 00:36:55.830072 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:36:55.830019 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" podUID="78f608da-2859-43f9-9a66-4d094bb5ee48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 25 00:37:05.829434 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:05.829393 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" podUID="78f608da-2859-43f9-9a66-4d094bb5ee48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 25 00:37:15.829435 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:15.829350 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" podUID="78f608da-2859-43f9-9a66-4d094bb5ee48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 25 00:37:25.829959 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:25.829896 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" podUID="78f608da-2859-43f9-9a66-4d094bb5ee48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 25 00:37:35.829934 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:35.829870 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" podUID="78f608da-2859-43f9-9a66-4d094bb5ee48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 25 00:37:45.830402 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:45.830372 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" Apr 25 00:37:53.355445 ip-10-0-129-109 kubenswrapper[2576]: I0425 
00:37:53.355413 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d"] Apr 25 00:37:53.355845 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.355716 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" podUID="78f608da-2859-43f9-9a66-4d094bb5ee48" containerName="kserve-container" containerID="cri-o://6caed4d709e9d61f084ef55ac34622da1b867053fe9072560de543aadb775aa0" gracePeriod=30 Apr 25 00:37:53.355845 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.355749 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" podUID="78f608da-2859-43f9-9a66-4d094bb5ee48" containerName="kube-rbac-proxy" containerID="cri-o://4ef45e06616e932466d174e49dba6ae2e13df27eaee46a924c87fda754186488" gracePeriod=30 Apr 25 00:37:53.453723 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.453696 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2"] Apr 25 00:37:53.454004 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.453991 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04fb9f8d-f6ca-42e4-9682-1895e9a0a871" containerName="kube-rbac-proxy" Apr 25 00:37:53.454004 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.454005 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="04fb9f8d-f6ca-42e4-9682-1895e9a0a871" containerName="kube-rbac-proxy" Apr 25 00:37:53.454108 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.454022 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04fb9f8d-f6ca-42e4-9682-1895e9a0a871" containerName="storage-initializer" Apr 25 00:37:53.454108 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.454028 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="04fb9f8d-f6ca-42e4-9682-1895e9a0a871" containerName="storage-initializer" Apr 25 00:37:53.454108 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.454065 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04fb9f8d-f6ca-42e4-9682-1895e9a0a871" containerName="kserve-container" Apr 25 00:37:53.454108 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.454071 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="04fb9f8d-f6ca-42e4-9682-1895e9a0a871" containerName="kserve-container" Apr 25 00:37:53.454241 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.454112 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="04fb9f8d-f6ca-42e4-9682-1895e9a0a871" containerName="kube-rbac-proxy" Apr 25 00:37:53.454241 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.454122 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="04fb9f8d-f6ca-42e4-9682-1895e9a0a871" containerName="kserve-container" Apr 25 00:37:53.457158 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.457140 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" Apr 25 00:37:53.459343 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.459325 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-predictor-serving-cert\"" Apr 25 00:37:53.459436 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.459403 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 25 00:37:53.466083 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.466040 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2"] Apr 25 00:37:53.475013 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.474991 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sdbl2\" (UID: \"bb8a02cc-6671-4309-ac74-7025b0c6eb8b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" Apr 25 00:37:53.475123 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.475065 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sdbl2\" (UID: \"bb8a02cc-6671-4309-ac74-7025b0c6eb8b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" Apr 25 00:37:53.475123 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.475089 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sdbl2\" (UID: \"bb8a02cc-6671-4309-ac74-7025b0c6eb8b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" Apr 25 00:37:53.475123 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.475106 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qr4k\" (UniqueName: \"kubernetes.io/projected/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-kube-api-access-7qr4k\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sdbl2\" (UID: \"bb8a02cc-6671-4309-ac74-7025b0c6eb8b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" Apr 25 00:37:53.575796 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.575768 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sdbl2\" (UID: \"bb8a02cc-6671-4309-ac74-7025b0c6eb8b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" Apr 25 00:37:53.575986 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.575824 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sdbl2\" (UID: \"bb8a02cc-6671-4309-ac74-7025b0c6eb8b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" Apr 25 00:37:53.575986 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.575864 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sdbl2\" (UID: \"bb8a02cc-6671-4309-ac74-7025b0c6eb8b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" Apr 25 00:37:53.575986 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:37:53.575944 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-serving-cert: secret "sklearn-v2-mlserver-predictor-serving-cert" not found Apr 25 00:37:53.576126 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:37:53.576013 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-proxy-tls podName:bb8a02cc-6671-4309-ac74-7025b0c6eb8b nodeName:}" failed. No retries permitted until 2026-04-25 00:37:54.0759922 +0000 UTC m=+2634.357109098 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-proxy-tls") pod "sklearn-v2-mlserver-predictor-65d8664766-sdbl2" (UID: "bb8a02cc-6671-4309-ac74-7025b0c6eb8b") : secret "sklearn-v2-mlserver-predictor-serving-cert" not found Apr 25 00:37:53.576126 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.576033 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qr4k\" (UniqueName: \"kubernetes.io/projected/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-kube-api-access-7qr4k\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sdbl2\" (UID: \"bb8a02cc-6671-4309-ac74-7025b0c6eb8b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" Apr 25 00:37:53.576218 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.576167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-kserve-provision-location\") pod 
\"sklearn-v2-mlserver-predictor-65d8664766-sdbl2\" (UID: \"bb8a02cc-6671-4309-ac74-7025b0c6eb8b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" Apr 25 00:37:53.576424 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.576405 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sdbl2\" (UID: \"bb8a02cc-6671-4309-ac74-7025b0c6eb8b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" Apr 25 00:37:53.584708 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:53.584687 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qr4k\" (UniqueName: \"kubernetes.io/projected/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-kube-api-access-7qr4k\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sdbl2\" (UID: \"bb8a02cc-6671-4309-ac74-7025b0c6eb8b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" Apr 25 00:37:54.043770 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:54.043737 2576 generic.go:358] "Generic (PLEG): container finished" podID="78f608da-2859-43f9-9a66-4d094bb5ee48" containerID="4ef45e06616e932466d174e49dba6ae2e13df27eaee46a924c87fda754186488" exitCode=2 Apr 25 00:37:54.043969 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:54.043811 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" event={"ID":"78f608da-2859-43f9-9a66-4d094bb5ee48","Type":"ContainerDied","Data":"4ef45e06616e932466d174e49dba6ae2e13df27eaee46a924c87fda754186488"} Apr 25 00:37:54.079291 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:54.079262 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sdbl2\" (UID: \"bb8a02cc-6671-4309-ac74-7025b0c6eb8b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" Apr 25 00:37:54.081775 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:54.081753 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sdbl2\" (UID: \"bb8a02cc-6671-4309-ac74-7025b0c6eb8b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" Apr 25 00:37:54.367541 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:54.367462 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" Apr 25 00:37:54.484156 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:54.484097 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2"] Apr 25 00:37:54.487072 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:37:54.487036 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb8a02cc_6671_4309_ac74_7025b0c6eb8b.slice/crio-103446a4dc44855f448cb40934e2e672630581359ebd3a472b50ec832400c1e2 WatchSource:0}: Error finding container 103446a4dc44855f448cb40934e2e672630581359ebd3a472b50ec832400c1e2: Status 404 returned error can't find the container with id 103446a4dc44855f448cb40934e2e672630581359ebd3a472b50ec832400c1e2 Apr 25 00:37:54.488951 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:54.488935 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:37:55.048326 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:55.048285 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" event={"ID":"bb8a02cc-6671-4309-ac74-7025b0c6eb8b","Type":"ContainerStarted","Data":"9473f7511a0ef41cb1e60465be5946a0cc29f0237a0475f28247eb137fdbcbb6"} Apr 25 00:37:55.048326 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:55.048332 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" event={"ID":"bb8a02cc-6671-4309-ac74-7025b0c6eb8b","Type":"ContainerStarted","Data":"103446a4dc44855f448cb40934e2e672630581359ebd3a472b50ec832400c1e2"} Apr 25 00:37:55.825690 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:55.825646 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" podUID="78f608da-2859-43f9-9a66-4d094bb5ee48" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.47:8643/healthz\": dial tcp 10.134.0.47:8643: connect: connection refused" Apr 25 00:37:55.829960 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:55.829933 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" podUID="78f608da-2859-43f9-9a66-4d094bb5ee48" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 25 00:37:57.296084 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:57.296061 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" Apr 25 00:37:57.303741 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:57.303721 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/78f608da-2859-43f9-9a66-4d094bb5ee48-proxy-tls\") pod \"78f608da-2859-43f9-9a66-4d094bb5ee48\" (UID: \"78f608da-2859-43f9-9a66-4d094bb5ee48\") " Apr 25 00:37:57.303861 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:57.303771 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78f608da-2859-43f9-9a66-4d094bb5ee48-kserve-provision-location\") pod \"78f608da-2859-43f9-9a66-4d094bb5ee48\" (UID: \"78f608da-2859-43f9-9a66-4d094bb5ee48\") " Apr 25 00:37:57.303861 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:57.303824 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/78f608da-2859-43f9-9a66-4d094bb5ee48-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"78f608da-2859-43f9-9a66-4d094bb5ee48\" (UID: \"78f608da-2859-43f9-9a66-4d094bb5ee48\") " Apr 25 00:37:57.303861 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:57.303847 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r78jp\" (UniqueName: \"kubernetes.io/projected/78f608da-2859-43f9-9a66-4d094bb5ee48-kube-api-access-r78jp\") pod \"78f608da-2859-43f9-9a66-4d094bb5ee48\" (UID: \"78f608da-2859-43f9-9a66-4d094bb5ee48\") " Apr 25 00:37:57.304136 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:57.304108 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78f608da-2859-43f9-9a66-4d094bb5ee48-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "78f608da-2859-43f9-9a66-4d094bb5ee48" 
(UID: "78f608da-2859-43f9-9a66-4d094bb5ee48"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:37:57.304242 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:57.304113 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78f608da-2859-43f9-9a66-4d094bb5ee48-isvc-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-kube-rbac-proxy-sar-config") pod "78f608da-2859-43f9-9a66-4d094bb5ee48" (UID: "78f608da-2859-43f9-9a66-4d094bb5ee48"). InnerVolumeSpecName "isvc-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:37:57.305853 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:57.305830 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f608da-2859-43f9-9a66-4d094bb5ee48-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "78f608da-2859-43f9-9a66-4d094bb5ee48" (UID: "78f608da-2859-43f9-9a66-4d094bb5ee48"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:37:57.305962 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:57.305940 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78f608da-2859-43f9-9a66-4d094bb5ee48-kube-api-access-r78jp" (OuterVolumeSpecName: "kube-api-access-r78jp") pod "78f608da-2859-43f9-9a66-4d094bb5ee48" (UID: "78f608da-2859-43f9-9a66-4d094bb5ee48"). InnerVolumeSpecName "kube-api-access-r78jp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:37:57.404613 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:57.404546 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/78f608da-2859-43f9-9a66-4d094bb5ee48-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:37:57.404613 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:57.404572 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/78f608da-2859-43f9-9a66-4d094bb5ee48-isvc-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:37:57.404613 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:57.404585 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r78jp\" (UniqueName: \"kubernetes.io/projected/78f608da-2859-43f9-9a66-4d094bb5ee48-kube-api-access-r78jp\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:37:57.404613 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:57.404594 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/78f608da-2859-43f9-9a66-4d094bb5ee48-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:37:58.061277 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:58.061241 2576 generic.go:358] "Generic (PLEG): container finished" podID="78f608da-2859-43f9-9a66-4d094bb5ee48" containerID="6caed4d709e9d61f084ef55ac34622da1b867053fe9072560de543aadb775aa0" exitCode=0 Apr 25 00:37:58.061474 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:58.061308 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" 
event={"ID":"78f608da-2859-43f9-9a66-4d094bb5ee48","Type":"ContainerDied","Data":"6caed4d709e9d61f084ef55ac34622da1b867053fe9072560de543aadb775aa0"} Apr 25 00:37:58.061474 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:58.061318 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" Apr 25 00:37:58.061474 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:58.061341 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d" event={"ID":"78f608da-2859-43f9-9a66-4d094bb5ee48","Type":"ContainerDied","Data":"5bf960d901db691fc4296b2e1d4fd471b523ee7c19a8e4d992253abd557b1447"} Apr 25 00:37:58.061474 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:58.061359 2576 scope.go:117] "RemoveContainer" containerID="4ef45e06616e932466d174e49dba6ae2e13df27eaee46a924c87fda754186488" Apr 25 00:37:58.069632 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:58.069613 2576 scope.go:117] "RemoveContainer" containerID="6caed4d709e9d61f084ef55ac34622da1b867053fe9072560de543aadb775aa0" Apr 25 00:37:58.077621 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:58.077595 2576 scope.go:117] "RemoveContainer" containerID="4649cdf09159352919ddc4414471794d5a0c502eb8465465660744d1f84e4555" Apr 25 00:37:58.082661 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:58.082639 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d"] Apr 25 00:37:58.085732 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:58.085715 2576 scope.go:117] "RemoveContainer" containerID="4ef45e06616e932466d174e49dba6ae2e13df27eaee46a924c87fda754186488" Apr 25 00:37:58.086072 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:37:58.086045 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4ef45e06616e932466d174e49dba6ae2e13df27eaee46a924c87fda754186488\": container with ID starting with 4ef45e06616e932466d174e49dba6ae2e13df27eaee46a924c87fda754186488 not found: ID does not exist" containerID="4ef45e06616e932466d174e49dba6ae2e13df27eaee46a924c87fda754186488" Apr 25 00:37:58.086152 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:58.086082 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ef45e06616e932466d174e49dba6ae2e13df27eaee46a924c87fda754186488"} err="failed to get container status \"4ef45e06616e932466d174e49dba6ae2e13df27eaee46a924c87fda754186488\": rpc error: code = NotFound desc = could not find container \"4ef45e06616e932466d174e49dba6ae2e13df27eaee46a924c87fda754186488\": container with ID starting with 4ef45e06616e932466d174e49dba6ae2e13df27eaee46a924c87fda754186488 not found: ID does not exist" Apr 25 00:37:58.086152 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:58.086106 2576 scope.go:117] "RemoveContainer" containerID="6caed4d709e9d61f084ef55ac34622da1b867053fe9072560de543aadb775aa0" Apr 25 00:37:58.086390 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:37:58.086369 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6caed4d709e9d61f084ef55ac34622da1b867053fe9072560de543aadb775aa0\": container with ID starting with 6caed4d709e9d61f084ef55ac34622da1b867053fe9072560de543aadb775aa0 not found: ID does not exist" containerID="6caed4d709e9d61f084ef55ac34622da1b867053fe9072560de543aadb775aa0" Apr 25 00:37:58.086434 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:58.086399 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-66877cc755-gpc2d"] Apr 25 00:37:58.086476 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:58.086405 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6caed4d709e9d61f084ef55ac34622da1b867053fe9072560de543aadb775aa0"} err="failed to get container status \"6caed4d709e9d61f084ef55ac34622da1b867053fe9072560de543aadb775aa0\": rpc error: code = NotFound desc = could not find container \"6caed4d709e9d61f084ef55ac34622da1b867053fe9072560de543aadb775aa0\": container with ID starting with 6caed4d709e9d61f084ef55ac34622da1b867053fe9072560de543aadb775aa0 not found: ID does not exist" Apr 25 00:37:58.086476 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:58.086446 2576 scope.go:117] "RemoveContainer" containerID="4649cdf09159352919ddc4414471794d5a0c502eb8465465660744d1f84e4555" Apr 25 00:37:58.086700 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:37:58.086683 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4649cdf09159352919ddc4414471794d5a0c502eb8465465660744d1f84e4555\": container with ID starting with 4649cdf09159352919ddc4414471794d5a0c502eb8465465660744d1f84e4555 not found: ID does not exist" containerID="4649cdf09159352919ddc4414471794d5a0c502eb8465465660744d1f84e4555" Apr 25 00:37:58.086758 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:58.086703 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4649cdf09159352919ddc4414471794d5a0c502eb8465465660744d1f84e4555"} err="failed to get container status \"4649cdf09159352919ddc4414471794d5a0c502eb8465465660744d1f84e4555\": rpc error: code = NotFound desc = could not find container \"4649cdf09159352919ddc4414471794d5a0c502eb8465465660744d1f84e4555\": container with ID starting with 4649cdf09159352919ddc4414471794d5a0c502eb8465465660744d1f84e4555 not found: ID does not exist" Apr 25 00:37:58.315622 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:58.315553 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78f608da-2859-43f9-9a66-4d094bb5ee48" 
path="/var/lib/kubelet/pods/78f608da-2859-43f9-9a66-4d094bb5ee48/volumes" Apr 25 00:37:59.065313 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:59.065278 2576 generic.go:358] "Generic (PLEG): container finished" podID="bb8a02cc-6671-4309-ac74-7025b0c6eb8b" containerID="9473f7511a0ef41cb1e60465be5946a0cc29f0237a0475f28247eb137fdbcbb6" exitCode=0 Apr 25 00:37:59.065510 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:37:59.065361 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" event={"ID":"bb8a02cc-6671-4309-ac74-7025b0c6eb8b","Type":"ContainerDied","Data":"9473f7511a0ef41cb1e60465be5946a0cc29f0237a0475f28247eb137fdbcbb6"} Apr 25 00:38:00.070436 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:00.070397 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" event={"ID":"bb8a02cc-6671-4309-ac74-7025b0c6eb8b","Type":"ContainerStarted","Data":"5a997cedc34ffc3dd8ac0996134a8084102367ba748160e9b4bd78e293100a96"} Apr 25 00:38:00.070436 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:00.070440 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" event={"ID":"bb8a02cc-6671-4309-ac74-7025b0c6eb8b","Type":"ContainerStarted","Data":"a930d0bd4aa4a50b46362d6c2a3dd25652a3da48456734db096037da00295812"} Apr 25 00:38:00.070989 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:00.070663 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" Apr 25 00:38:00.090320 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:00.090278 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" podStartSLOduration=7.090266352 podStartE2EDuration="7.090266352s" podCreationTimestamp="2026-04-25 00:37:53 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:38:00.088543578 +0000 UTC m=+2640.369660511" watchObservedRunningTime="2026-04-25 00:38:00.090266352 +0000 UTC m=+2640.371383271" Apr 25 00:38:01.073383 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:01.073351 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" Apr 25 00:38:07.082034 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:07.082003 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" Apr 25 00:38:37.085852 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:37.085810 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" podUID="bb8a02cc-6671-4309-ac74-7025b0c6eb8b" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 25 00:38:47.085050 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:47.085015 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" Apr 25 00:38:53.528781 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.528743 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2"] Apr 25 00:38:53.529288 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.529255 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" podUID="bb8a02cc-6671-4309-ac74-7025b0c6eb8b" containerName="kube-rbac-proxy" containerID="cri-o://5a997cedc34ffc3dd8ac0996134a8084102367ba748160e9b4bd78e293100a96" gracePeriod=30 Apr 25 00:38:53.529391 ip-10-0-129-109 
kubenswrapper[2576]: I0425 00:38:53.529255 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" podUID="bb8a02cc-6671-4309-ac74-7025b0c6eb8b" containerName="kserve-container" containerID="cri-o://a930d0bd4aa4a50b46362d6c2a3dd25652a3da48456734db096037da00295812" gracePeriod=30 Apr 25 00:38:53.625855 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.625816 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk"] Apr 25 00:38:53.626133 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.626118 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78f608da-2859-43f9-9a66-4d094bb5ee48" containerName="storage-initializer" Apr 25 00:38:53.626197 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.626135 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f608da-2859-43f9-9a66-4d094bb5ee48" containerName="storage-initializer" Apr 25 00:38:53.626197 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.626154 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78f608da-2859-43f9-9a66-4d094bb5ee48" containerName="kserve-container" Apr 25 00:38:53.626197 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.626162 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f608da-2859-43f9-9a66-4d094bb5ee48" containerName="kserve-container" Apr 25 00:38:53.626197 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.626175 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78f608da-2859-43f9-9a66-4d094bb5ee48" containerName="kube-rbac-proxy" Apr 25 00:38:53.626197 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.626184 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f608da-2859-43f9-9a66-4d094bb5ee48" containerName="kube-rbac-proxy" Apr 25 00:38:53.626370 ip-10-0-129-109 kubenswrapper[2576]: 
I0425 00:38:53.626228 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="78f608da-2859-43f9-9a66-4d094bb5ee48" containerName="kserve-container" Apr 25 00:38:53.626370 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.626238 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="78f608da-2859-43f9-9a66-4d094bb5ee48" containerName="kube-rbac-proxy" Apr 25 00:38:53.629377 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.629361 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" Apr 25 00:38:53.631905 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.631882 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\"" Apr 25 00:38:53.632003 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.631889 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-predictor-serving-cert\"" Apr 25 00:38:53.638377 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.638355 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk"] Apr 25 00:38:53.722309 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.722278 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk\" (UID: \"8f53103f-b3ff-4b31-a09f-b9aa912df6a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" Apr 25 00:38:53.722309 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.722324 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk\" (UID: \"8f53103f-b3ff-4b31-a09f-b9aa912df6a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" Apr 25 00:38:53.722535 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.722432 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk\" (UID: \"8f53103f-b3ff-4b31-a09f-b9aa912df6a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" Apr 25 00:38:53.722535 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.722486 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9bj7\" (UniqueName: \"kubernetes.io/projected/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-kube-api-access-h9bj7\") pod \"isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk\" (UID: \"8f53103f-b3ff-4b31-a09f-b9aa912df6a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" Apr 25 00:38:53.823863 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.823751 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9bj7\" (UniqueName: \"kubernetes.io/projected/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-kube-api-access-h9bj7\") pod \"isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk\" (UID: \"8f53103f-b3ff-4b31-a09f-b9aa912df6a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" Apr 25 00:38:53.823863 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.823817 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk\" (UID: \"8f53103f-b3ff-4b31-a09f-b9aa912df6a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" Apr 25 00:38:53.823863 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.823845 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk\" (UID: \"8f53103f-b3ff-4b31-a09f-b9aa912df6a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" Apr 25 00:38:53.824211 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.823889 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk\" (UID: \"8f53103f-b3ff-4b31-a09f-b9aa912df6a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" Apr 25 00:38:53.824211 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:38:53.824010 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-serving-cert: secret "isvc-sklearn-runtime-predictor-serving-cert" not found Apr 25 00:38:53.824211 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:38:53.824074 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-proxy-tls podName:8f53103f-b3ff-4b31-a09f-b9aa912df6a9 nodeName:}" failed. No retries permitted until 2026-04-25 00:38:54.324053926 +0000 UTC m=+2694.605170826 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-proxy-tls") pod "isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" (UID: "8f53103f-b3ff-4b31-a09f-b9aa912df6a9") : secret "isvc-sklearn-runtime-predictor-serving-cert" not found Apr 25 00:38:53.824402 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.824285 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk\" (UID: \"8f53103f-b3ff-4b31-a09f-b9aa912df6a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" Apr 25 00:38:53.824561 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.824542 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk\" (UID: \"8f53103f-b3ff-4b31-a09f-b9aa912df6a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" Apr 25 00:38:53.832876 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:53.832854 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9bj7\" (UniqueName: \"kubernetes.io/projected/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-kube-api-access-h9bj7\") pod \"isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk\" (UID: \"8f53103f-b3ff-4b31-a09f-b9aa912df6a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" Apr 25 00:38:54.224744 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:54.224661 2576 generic.go:358] "Generic (PLEG): container finished" podID="bb8a02cc-6671-4309-ac74-7025b0c6eb8b" 
containerID="5a997cedc34ffc3dd8ac0996134a8084102367ba748160e9b4bd78e293100a96" exitCode=2 Apr 25 00:38:54.224744 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:54.224705 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" event={"ID":"bb8a02cc-6671-4309-ac74-7025b0c6eb8b","Type":"ContainerDied","Data":"5a997cedc34ffc3dd8ac0996134a8084102367ba748160e9b4bd78e293100a96"} Apr 25 00:38:54.327549 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:54.327518 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk\" (UID: \"8f53103f-b3ff-4b31-a09f-b9aa912df6a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" Apr 25 00:38:54.329984 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:54.329962 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk\" (UID: \"8f53103f-b3ff-4b31-a09f-b9aa912df6a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" Apr 25 00:38:54.540487 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:54.540450 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" Apr 25 00:38:54.659588 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:54.659563 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk"] Apr 25 00:38:54.661582 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:38:54.661556 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f53103f_b3ff_4b31_a09f_b9aa912df6a9.slice/crio-377bdae4fbb73273503123f1e07fa2a4817cd732a33ad5920644f64db743caa8 WatchSource:0}: Error finding container 377bdae4fbb73273503123f1e07fa2a4817cd732a33ad5920644f64db743caa8: Status 404 returned error can't find the container with id 377bdae4fbb73273503123f1e07fa2a4817cd732a33ad5920644f64db743caa8 Apr 25 00:38:55.234275 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:55.234241 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" event={"ID":"8f53103f-b3ff-4b31-a09f-b9aa912df6a9","Type":"ContainerStarted","Data":"37a7bc86b900f450ab829a26fed40f3a7e84d1f07fbc6b26d3eef40050e3736b"} Apr 25 00:38:55.234275 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:55.234278 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" event={"ID":"8f53103f-b3ff-4b31-a09f-b9aa912df6a9","Type":"ContainerStarted","Data":"377bdae4fbb73273503123f1e07fa2a4817cd732a33ad5920644f64db743caa8"} Apr 25 00:38:57.077399 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:57.077353 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" podUID="bb8a02cc-6671-4309-ac74-7025b0c6eb8b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.48:8643/healthz\": dial tcp 10.134.0.48:8643: connect: connection 
refused" Apr 25 00:38:57.082876 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:38:57.082845 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" podUID="bb8a02cc-6671-4309-ac74-7025b0c6eb8b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.48:8080/v2/models/sklearn-v2-mlserver/ready\": dial tcp 10.134.0.48:8080: connect: connection refused" Apr 25 00:39:00.250112 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:00.250020 2576 generic.go:358] "Generic (PLEG): container finished" podID="8f53103f-b3ff-4b31-a09f-b9aa912df6a9" containerID="37a7bc86b900f450ab829a26fed40f3a7e84d1f07fbc6b26d3eef40050e3736b" exitCode=0 Apr 25 00:39:00.250112 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:00.250092 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" event={"ID":"8f53103f-b3ff-4b31-a09f-b9aa912df6a9","Type":"ContainerDied","Data":"37a7bc86b900f450ab829a26fed40f3a7e84d1f07fbc6b26d3eef40050e3736b"} Apr 25 00:39:00.656690 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:00.656669 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" Apr 25 00:39:00.781199 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:00.781171 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-proxy-tls\") pod \"bb8a02cc-6671-4309-ac74-7025b0c6eb8b\" (UID: \"bb8a02cc-6671-4309-ac74-7025b0c6eb8b\") " Apr 25 00:39:00.781399 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:00.781234 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"bb8a02cc-6671-4309-ac74-7025b0c6eb8b\" (UID: \"bb8a02cc-6671-4309-ac74-7025b0c6eb8b\") " Apr 25 00:39:00.781399 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:00.781292 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-kserve-provision-location\") pod \"bb8a02cc-6671-4309-ac74-7025b0c6eb8b\" (UID: \"bb8a02cc-6671-4309-ac74-7025b0c6eb8b\") " Apr 25 00:39:00.781399 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:00.781332 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qr4k\" (UniqueName: \"kubernetes.io/projected/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-kube-api-access-7qr4k\") pod \"bb8a02cc-6671-4309-ac74-7025b0c6eb8b\" (UID: \"bb8a02cc-6671-4309-ac74-7025b0c6eb8b\") " Apr 25 00:39:00.781637 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:00.781617 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"bb8a02cc-6671-4309-ac74-7025b0c6eb8b" (UID: "bb8a02cc-6671-4309-ac74-7025b0c6eb8b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:39:00.781721 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:00.781672 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-sklearn-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "sklearn-v2-mlserver-kube-rbac-proxy-sar-config") pod "bb8a02cc-6671-4309-ac74-7025b0c6eb8b" (UID: "bb8a02cc-6671-4309-ac74-7025b0c6eb8b"). InnerVolumeSpecName "sklearn-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:39:00.783525 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:00.783493 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bb8a02cc-6671-4309-ac74-7025b0c6eb8b" (UID: "bb8a02cc-6671-4309-ac74-7025b0c6eb8b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:39:00.783611 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:00.783562 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-kube-api-access-7qr4k" (OuterVolumeSpecName: "kube-api-access-7qr4k") pod "bb8a02cc-6671-4309-ac74-7025b0c6eb8b" (UID: "bb8a02cc-6671-4309-ac74-7025b0c6eb8b"). InnerVolumeSpecName "kube-api-access-7qr4k". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:39:00.882259 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:00.882187 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7qr4k\" (UniqueName: \"kubernetes.io/projected/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-kube-api-access-7qr4k\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:39:00.882259 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:00.882212 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:39:00.882259 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:00.882226 2576 reconciler_common.go:299] "Volume detached for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:39:00.882259 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:00.882239 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb8a02cc-6671-4309-ac74-7025b0c6eb8b-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:39:01.254791 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:01.254697 2576 generic.go:358] "Generic (PLEG): container finished" podID="bb8a02cc-6671-4309-ac74-7025b0c6eb8b" containerID="a930d0bd4aa4a50b46362d6c2a3dd25652a3da48456734db096037da00295812" exitCode=0 Apr 25 00:39:01.255261 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:01.254801 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" 
event={"ID":"bb8a02cc-6671-4309-ac74-7025b0c6eb8b","Type":"ContainerDied","Data":"a930d0bd4aa4a50b46362d6c2a3dd25652a3da48456734db096037da00295812"} Apr 25 00:39:01.255261 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:01.254835 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" event={"ID":"bb8a02cc-6671-4309-ac74-7025b0c6eb8b","Type":"ContainerDied","Data":"103446a4dc44855f448cb40934e2e672630581359ebd3a472b50ec832400c1e2"} Apr 25 00:39:01.255261 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:01.254858 2576 scope.go:117] "RemoveContainer" containerID="5a997cedc34ffc3dd8ac0996134a8084102367ba748160e9b4bd78e293100a96" Apr 25 00:39:01.255261 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:01.254888 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2" Apr 25 00:39:01.256987 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:01.256958 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" event={"ID":"8f53103f-b3ff-4b31-a09f-b9aa912df6a9","Type":"ContainerStarted","Data":"241d7c1712ff82172de85bbec0875aeeb0d0580263b60a1f3424916b4ef69d62"} Apr 25 00:39:01.257098 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:01.257003 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" event={"ID":"8f53103f-b3ff-4b31-a09f-b9aa912df6a9","Type":"ContainerStarted","Data":"ef2d4ccf0aa74f6a15953931a7109a107787581e026f477e1bf4aafcde8c6285"} Apr 25 00:39:01.257278 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:01.257263 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" Apr 25 00:39:01.263546 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:01.263533 2576 scope.go:117] 
"RemoveContainer" containerID="a930d0bd4aa4a50b46362d6c2a3dd25652a3da48456734db096037da00295812" Apr 25 00:39:01.270414 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:01.270392 2576 scope.go:117] "RemoveContainer" containerID="9473f7511a0ef41cb1e60465be5946a0cc29f0237a0475f28247eb137fdbcbb6" Apr 25 00:39:01.278887 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:01.278842 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" podStartSLOduration=8.278828215 podStartE2EDuration="8.278828215s" podCreationTimestamp="2026-04-25 00:38:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:39:01.277839332 +0000 UTC m=+2701.558956251" watchObservedRunningTime="2026-04-25 00:39:01.278828215 +0000 UTC m=+2701.559945137" Apr 25 00:39:01.278887 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:01.278877 2576 scope.go:117] "RemoveContainer" containerID="5a997cedc34ffc3dd8ac0996134a8084102367ba748160e9b4bd78e293100a96" Apr 25 00:39:01.279141 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:39:01.279118 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a997cedc34ffc3dd8ac0996134a8084102367ba748160e9b4bd78e293100a96\": container with ID starting with 5a997cedc34ffc3dd8ac0996134a8084102367ba748160e9b4bd78e293100a96 not found: ID does not exist" containerID="5a997cedc34ffc3dd8ac0996134a8084102367ba748160e9b4bd78e293100a96" Apr 25 00:39:01.279220 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:01.279149 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a997cedc34ffc3dd8ac0996134a8084102367ba748160e9b4bd78e293100a96"} err="failed to get container status \"5a997cedc34ffc3dd8ac0996134a8084102367ba748160e9b4bd78e293100a96\": rpc error: code = NotFound desc = could not find 
container \"5a997cedc34ffc3dd8ac0996134a8084102367ba748160e9b4bd78e293100a96\": container with ID starting with 5a997cedc34ffc3dd8ac0996134a8084102367ba748160e9b4bd78e293100a96 not found: ID does not exist" Apr 25 00:39:01.279220 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:01.279173 2576 scope.go:117] "RemoveContainer" containerID="a930d0bd4aa4a50b46362d6c2a3dd25652a3da48456734db096037da00295812" Apr 25 00:39:01.279404 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:39:01.279387 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a930d0bd4aa4a50b46362d6c2a3dd25652a3da48456734db096037da00295812\": container with ID starting with a930d0bd4aa4a50b46362d6c2a3dd25652a3da48456734db096037da00295812 not found: ID does not exist" containerID="a930d0bd4aa4a50b46362d6c2a3dd25652a3da48456734db096037da00295812" Apr 25 00:39:01.279473 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:01.279421 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a930d0bd4aa4a50b46362d6c2a3dd25652a3da48456734db096037da00295812"} err="failed to get container status \"a930d0bd4aa4a50b46362d6c2a3dd25652a3da48456734db096037da00295812\": rpc error: code = NotFound desc = could not find container \"a930d0bd4aa4a50b46362d6c2a3dd25652a3da48456734db096037da00295812\": container with ID starting with a930d0bd4aa4a50b46362d6c2a3dd25652a3da48456734db096037da00295812 not found: ID does not exist" Apr 25 00:39:01.279473 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:01.279436 2576 scope.go:117] "RemoveContainer" containerID="9473f7511a0ef41cb1e60465be5946a0cc29f0237a0475f28247eb137fdbcbb6" Apr 25 00:39:01.279696 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:39:01.279679 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9473f7511a0ef41cb1e60465be5946a0cc29f0237a0475f28247eb137fdbcbb6\": container with ID 
starting with 9473f7511a0ef41cb1e60465be5946a0cc29f0237a0475f28247eb137fdbcbb6 not found: ID does not exist" containerID="9473f7511a0ef41cb1e60465be5946a0cc29f0237a0475f28247eb137fdbcbb6" Apr 25 00:39:01.279770 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:01.279698 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9473f7511a0ef41cb1e60465be5946a0cc29f0237a0475f28247eb137fdbcbb6"} err="failed to get container status \"9473f7511a0ef41cb1e60465be5946a0cc29f0237a0475f28247eb137fdbcbb6\": rpc error: code = NotFound desc = could not find container \"9473f7511a0ef41cb1e60465be5946a0cc29f0237a0475f28247eb137fdbcbb6\": container with ID starting with 9473f7511a0ef41cb1e60465be5946a0cc29f0237a0475f28247eb137fdbcbb6 not found: ID does not exist" Apr 25 00:39:01.290721 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:01.290697 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2"] Apr 25 00:39:01.297715 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:01.297681 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sdbl2"] Apr 25 00:39:02.261524 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:02.261483 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" Apr 25 00:39:02.262471 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:02.262444 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" podUID="8f53103f-b3ff-4b31-a09f-b9aa912df6a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 25 00:39:02.317690 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:02.317653 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bb8a02cc-6671-4309-ac74-7025b0c6eb8b" path="/var/lib/kubelet/pods/bb8a02cc-6671-4309-ac74-7025b0c6eb8b/volumes" Apr 25 00:39:03.263937 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:03.263891 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" podUID="8f53103f-b3ff-4b31-a09f-b9aa912df6a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 25 00:39:08.268587 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:08.268557 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" Apr 25 00:39:08.269146 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:08.269118 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" podUID="8f53103f-b3ff-4b31-a09f-b9aa912df6a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 25 00:39:18.270075 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:18.270036 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" Apr 25 00:39:30.541908 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.541859 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk_8f53103f-b3ff-4b31-a09f-b9aa912df6a9/kserve-container/0.log" Apr 25 00:39:30.666247 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.666213 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk"] Apr 25 00:39:30.666651 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.666623 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" podUID="8f53103f-b3ff-4b31-a09f-b9aa912df6a9" containerName="kserve-container" containerID="cri-o://ef2d4ccf0aa74f6a15953931a7109a107787581e026f477e1bf4aafcde8c6285" gracePeriod=30 Apr 25 00:39:30.666757 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.666655 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" podUID="8f53103f-b3ff-4b31-a09f-b9aa912df6a9" containerName="kube-rbac-proxy" containerID="cri-o://241d7c1712ff82172de85bbec0875aeeb0d0580263b60a1f3424916b4ef69d62" gracePeriod=30 Apr 25 00:39:30.730403 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.730370 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"] Apr 25 00:39:30.730660 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.730649 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb8a02cc-6671-4309-ac74-7025b0c6eb8b" containerName="storage-initializer" Apr 25 00:39:30.730706 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.730662 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8a02cc-6671-4309-ac74-7025b0c6eb8b" containerName="storage-initializer" Apr 25 00:39:30.730706 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.730671 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb8a02cc-6671-4309-ac74-7025b0c6eb8b" containerName="kserve-container" Apr 25 00:39:30.730706 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.730676 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8a02cc-6671-4309-ac74-7025b0c6eb8b" containerName="kserve-container" Apr 25 00:39:30.730706 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.730684 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb8a02cc-6671-4309-ac74-7025b0c6eb8b" 
containerName="kube-rbac-proxy"
Apr 25 00:39:30.730706 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.730689 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8a02cc-6671-4309-ac74-7025b0c6eb8b" containerName="kube-rbac-proxy"
Apr 25 00:39:30.730863 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.730739 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb8a02cc-6671-4309-ac74-7025b0c6eb8b" containerName="kube-rbac-proxy"
Apr 25 00:39:30.730863 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.730748 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb8a02cc-6671-4309-ac74-7025b0c6eb8b" containerName="kserve-container"
Apr 25 00:39:30.733755 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.733736 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"
Apr 25 00:39:30.736075 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.736048 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-predictor-serving-cert\""
Apr 25 00:39:30.736199 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.736059 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\""
Apr 25 00:39:30.744329 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.744309 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"]
Apr 25 00:39:30.798184 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.798139 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f935536-443a-4832-9420-f06754a917c8-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg\" (UID: \"4f935536-443a-4832-9420-f06754a917c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"
Apr 25 00:39:30.798184 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.798176 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f935536-443a-4832-9420-f06754a917c8-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg\" (UID: \"4f935536-443a-4832-9420-f06754a917c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"
Apr 25 00:39:30.798350 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.798196 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpjng\" (UniqueName: \"kubernetes.io/projected/4f935536-443a-4832-9420-f06754a917c8-kube-api-access-kpjng\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg\" (UID: \"4f935536-443a-4832-9420-f06754a917c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"
Apr 25 00:39:30.798350 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.798299 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f935536-443a-4832-9420-f06754a917c8-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg\" (UID: \"4f935536-443a-4832-9420-f06754a917c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"
Apr 25 00:39:30.899497 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.899467 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f935536-443a-4832-9420-f06754a917c8-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg\" (UID: \"4f935536-443a-4832-9420-f06754a917c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"
Apr 25 00:39:30.899653 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.899502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kpjng\" (UniqueName: \"kubernetes.io/projected/4f935536-443a-4832-9420-f06754a917c8-kube-api-access-kpjng\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg\" (UID: \"4f935536-443a-4832-9420-f06754a917c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"
Apr 25 00:39:30.899653 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.899527 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f935536-443a-4832-9420-f06754a917c8-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg\" (UID: \"4f935536-443a-4832-9420-f06754a917c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"
Apr 25 00:39:30.899653 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.899580 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f935536-443a-4832-9420-f06754a917c8-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg\" (UID: \"4f935536-443a-4832-9420-f06754a917c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"
Apr 25 00:39:30.900042 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.900016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f935536-443a-4832-9420-f06754a917c8-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg\" (UID: \"4f935536-443a-4832-9420-f06754a917c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"
Apr 25 00:39:30.900231 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.900210 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f935536-443a-4832-9420-f06754a917c8-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg\" (UID: \"4f935536-443a-4832-9420-f06754a917c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"
Apr 25 00:39:30.901959 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.901942 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f935536-443a-4832-9420-f06754a917c8-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg\" (UID: \"4f935536-443a-4832-9420-f06754a917c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"
Apr 25 00:39:30.910971 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:30.910946 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpjng\" (UniqueName: \"kubernetes.io/projected/4f935536-443a-4832-9420-f06754a917c8-kube-api-access-kpjng\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg\" (UID: \"4f935536-443a-4832-9420-f06754a917c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"
Apr 25 00:39:31.046087 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:31.046034 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"
Apr 25 00:39:31.171026 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:31.170998 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"]
Apr 25 00:39:31.174183 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:39:31.174159 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f935536_443a_4832_9420_f06754a917c8.slice/crio-bfb429cdc66061427166aedfcb01d4880a100a8c8406ae589935721e1fb840eb WatchSource:0}: Error finding container bfb429cdc66061427166aedfcb01d4880a100a8c8406ae589935721e1fb840eb: Status 404 returned error can't find the container with id bfb429cdc66061427166aedfcb01d4880a100a8c8406ae589935721e1fb840eb
Apr 25 00:39:31.341415 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:31.341383 2576 generic.go:358] "Generic (PLEG): container finished" podID="8f53103f-b3ff-4b31-a09f-b9aa912df6a9" containerID="241d7c1712ff82172de85bbec0875aeeb0d0580263b60a1f3424916b4ef69d62" exitCode=2
Apr 25 00:39:31.341566 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:31.341461 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" event={"ID":"8f53103f-b3ff-4b31-a09f-b9aa912df6a9","Type":"ContainerDied","Data":"241d7c1712ff82172de85bbec0875aeeb0d0580263b60a1f3424916b4ef69d62"}
Apr 25 00:39:31.342835 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:31.342802 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg" event={"ID":"4f935536-443a-4832-9420-f06754a917c8","Type":"ContainerStarted","Data":"ddea857e535fac809ecc98dd30f62e2e80500ef630fb1b41229c96d360c1ed2a"}
Apr 25 00:39:31.343001 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:31.342836 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg" event={"ID":"4f935536-443a-4832-9420-f06754a917c8","Type":"ContainerStarted","Data":"bfb429cdc66061427166aedfcb01d4880a100a8c8406ae589935721e1fb840eb"}
Apr 25 00:39:31.491111 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:31.491088 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk"
Apr 25 00:39:31.504650 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:31.504629 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-proxy-tls\") pod \"8f53103f-b3ff-4b31-a09f-b9aa912df6a9\" (UID: \"8f53103f-b3ff-4b31-a09f-b9aa912df6a9\") "
Apr 25 00:39:31.504738 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:31.504695 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9bj7\" (UniqueName: \"kubernetes.io/projected/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-kube-api-access-h9bj7\") pod \"8f53103f-b3ff-4b31-a09f-b9aa912df6a9\" (UID: \"8f53103f-b3ff-4b31-a09f-b9aa912df6a9\") "
Apr 25 00:39:31.504795 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:31.504756 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-kserve-provision-location\") pod \"8f53103f-b3ff-4b31-a09f-b9aa912df6a9\" (UID: \"8f53103f-b3ff-4b31-a09f-b9aa912df6a9\") "
Apr 25 00:39:31.504795 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:31.504787 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"8f53103f-b3ff-4b31-a09f-b9aa912df6a9\" (UID: \"8f53103f-b3ff-4b31-a09f-b9aa912df6a9\") "
Apr 25 00:39:31.505246 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:31.505202 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-isvc-sklearn-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-runtime-kube-rbac-proxy-sar-config") pod "8f53103f-b3ff-4b31-a09f-b9aa912df6a9" (UID: "8f53103f-b3ff-4b31-a09f-b9aa912df6a9"). InnerVolumeSpecName "isvc-sklearn-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:39:31.506936 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:31.506890 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8f53103f-b3ff-4b31-a09f-b9aa912df6a9" (UID: "8f53103f-b3ff-4b31-a09f-b9aa912df6a9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:39:31.506936 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:31.506908 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-kube-api-access-h9bj7" (OuterVolumeSpecName: "kube-api-access-h9bj7") pod "8f53103f-b3ff-4b31-a09f-b9aa912df6a9" (UID: "8f53103f-b3ff-4b31-a09f-b9aa912df6a9"). InnerVolumeSpecName "kube-api-access-h9bj7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:39:31.531700 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:31.531647 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8f53103f-b3ff-4b31-a09f-b9aa912df6a9" (UID: "8f53103f-b3ff-4b31-a09f-b9aa912df6a9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:39:31.605974 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:31.605885 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h9bj7\" (UniqueName: \"kubernetes.io/projected/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-kube-api-access-h9bj7\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:39:31.605974 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:31.605932 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:39:31.605974 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:31.605944 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:39:31.605974 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:31.605957 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f53103f-b3ff-4b31-a09f-b9aa912df6a9-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:39:32.347354 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:32.347323 2576 generic.go:358] "Generic (PLEG): container finished" podID="8f53103f-b3ff-4b31-a09f-b9aa912df6a9" containerID="ef2d4ccf0aa74f6a15953931a7109a107787581e026f477e1bf4aafcde8c6285" exitCode=0
Apr 25 00:39:32.347496 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:32.347391 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk"
Apr 25 00:39:32.347496 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:32.347408 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" event={"ID":"8f53103f-b3ff-4b31-a09f-b9aa912df6a9","Type":"ContainerDied","Data":"ef2d4ccf0aa74f6a15953931a7109a107787581e026f477e1bf4aafcde8c6285"}
Apr 25 00:39:32.347496 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:32.347449 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk" event={"ID":"8f53103f-b3ff-4b31-a09f-b9aa912df6a9","Type":"ContainerDied","Data":"377bdae4fbb73273503123f1e07fa2a4817cd732a33ad5920644f64db743caa8"}
Apr 25 00:39:32.347496 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:32.347469 2576 scope.go:117] "RemoveContainer" containerID="241d7c1712ff82172de85bbec0875aeeb0d0580263b60a1f3424916b4ef69d62"
Apr 25 00:39:32.355362 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:32.355346 2576 scope.go:117] "RemoveContainer" containerID="ef2d4ccf0aa74f6a15953931a7109a107787581e026f477e1bf4aafcde8c6285"
Apr 25 00:39:32.362673 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:32.362650 2576 scope.go:117] "RemoveContainer" containerID="37a7bc86b900f450ab829a26fed40f3a7e84d1f07fbc6b26d3eef40050e3736b"
Apr 25 00:39:32.363620 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:32.363601 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk"]
Apr 25 00:39:32.367203 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:32.367182 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-75fc9fb85c-4kmkk"]
Apr 25 00:39:32.369749 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:32.369731 2576 scope.go:117] "RemoveContainer" containerID="241d7c1712ff82172de85bbec0875aeeb0d0580263b60a1f3424916b4ef69d62"
Apr 25 00:39:32.370028 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:39:32.370011 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"241d7c1712ff82172de85bbec0875aeeb0d0580263b60a1f3424916b4ef69d62\": container with ID starting with 241d7c1712ff82172de85bbec0875aeeb0d0580263b60a1f3424916b4ef69d62 not found: ID does not exist" containerID="241d7c1712ff82172de85bbec0875aeeb0d0580263b60a1f3424916b4ef69d62"
Apr 25 00:39:32.370085 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:32.370036 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"241d7c1712ff82172de85bbec0875aeeb0d0580263b60a1f3424916b4ef69d62"} err="failed to get container status \"241d7c1712ff82172de85bbec0875aeeb0d0580263b60a1f3424916b4ef69d62\": rpc error: code = NotFound desc = could not find container \"241d7c1712ff82172de85bbec0875aeeb0d0580263b60a1f3424916b4ef69d62\": container with ID starting with 241d7c1712ff82172de85bbec0875aeeb0d0580263b60a1f3424916b4ef69d62 not found: ID does not exist"
Apr 25 00:39:32.370085 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:32.370061 2576 scope.go:117] "RemoveContainer" containerID="ef2d4ccf0aa74f6a15953931a7109a107787581e026f477e1bf4aafcde8c6285"
Apr 25 00:39:32.370291 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:39:32.370276 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef2d4ccf0aa74f6a15953931a7109a107787581e026f477e1bf4aafcde8c6285\": container with ID starting with ef2d4ccf0aa74f6a15953931a7109a107787581e026f477e1bf4aafcde8c6285 not found: ID does not exist" containerID="ef2d4ccf0aa74f6a15953931a7109a107787581e026f477e1bf4aafcde8c6285"
Apr 25 00:39:32.370352 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:32.370294 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef2d4ccf0aa74f6a15953931a7109a107787581e026f477e1bf4aafcde8c6285"} err="failed to get container status \"ef2d4ccf0aa74f6a15953931a7109a107787581e026f477e1bf4aafcde8c6285\": rpc error: code = NotFound desc = could not find container \"ef2d4ccf0aa74f6a15953931a7109a107787581e026f477e1bf4aafcde8c6285\": container with ID starting with ef2d4ccf0aa74f6a15953931a7109a107787581e026f477e1bf4aafcde8c6285 not found: ID does not exist"
Apr 25 00:39:32.370352 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:32.370308 2576 scope.go:117] "RemoveContainer" containerID="37a7bc86b900f450ab829a26fed40f3a7e84d1f07fbc6b26d3eef40050e3736b"
Apr 25 00:39:32.370538 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:39:32.370522 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37a7bc86b900f450ab829a26fed40f3a7e84d1f07fbc6b26d3eef40050e3736b\": container with ID starting with 37a7bc86b900f450ab829a26fed40f3a7e84d1f07fbc6b26d3eef40050e3736b not found: ID does not exist" containerID="37a7bc86b900f450ab829a26fed40f3a7e84d1f07fbc6b26d3eef40050e3736b"
Apr 25 00:39:32.370583 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:32.370540 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37a7bc86b900f450ab829a26fed40f3a7e84d1f07fbc6b26d3eef40050e3736b"} err="failed to get container status \"37a7bc86b900f450ab829a26fed40f3a7e84d1f07fbc6b26d3eef40050e3736b\": rpc error: code = NotFound desc = could not find container \"37a7bc86b900f450ab829a26fed40f3a7e84d1f07fbc6b26d3eef40050e3736b\": container with ID starting with 37a7bc86b900f450ab829a26fed40f3a7e84d1f07fbc6b26d3eef40050e3736b not found: ID does not exist"
Apr 25 00:39:34.316002 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:34.315968 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f53103f-b3ff-4b31-a09f-b9aa912df6a9" path="/var/lib/kubelet/pods/8f53103f-b3ff-4b31-a09f-b9aa912df6a9/volumes"
Apr 25 00:39:35.359845 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:35.359813 2576 generic.go:358] "Generic (PLEG): container finished" podID="4f935536-443a-4832-9420-f06754a917c8" containerID="ddea857e535fac809ecc98dd30f62e2e80500ef630fb1b41229c96d360c1ed2a" exitCode=0
Apr 25 00:39:35.360232 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:35.359856 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg" event={"ID":"4f935536-443a-4832-9420-f06754a917c8","Type":"ContainerDied","Data":"ddea857e535fac809ecc98dd30f62e2e80500ef630fb1b41229c96d360c1ed2a"}
Apr 25 00:39:36.363850 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:36.363816 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg" event={"ID":"4f935536-443a-4832-9420-f06754a917c8","Type":"ContainerStarted","Data":"3246b42d1ba251cc660bab7019ebe69b820605ed639eed28ad48f9003a4488b9"}
Apr 25 00:39:36.363850 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:36.363850 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg" event={"ID":"4f935536-443a-4832-9420-f06754a917c8","Type":"ContainerStarted","Data":"b9419fded361108683355aea01ee1b906da21d58feeac35e5e488b90eae11c4f"}
Apr 25 00:39:36.364286 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:36.364199 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"
Apr 25 00:39:36.364286 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:36.364230 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"
Apr 25 00:39:36.382625 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:36.382506 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg" podStartSLOduration=6.38249468 podStartE2EDuration="6.38249468s" podCreationTimestamp="2026-04-25 00:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:39:36.381563362 +0000 UTC m=+2736.662680281" watchObservedRunningTime="2026-04-25 00:39:36.38249468 +0000 UTC m=+2736.663611601"
Apr 25 00:39:37.130445 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:37.130417 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log"
Apr 25 00:39:37.134881 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:37.134863 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log"
Apr 25 00:39:42.372825 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:39:42.372796 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"
Apr 25 00:40:12.379643 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:12.379559 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg" podUID="4f935536-443a-4832-9420-f06754a917c8" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400"
Apr 25 00:40:22.375429 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:22.375397 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"
Apr 25 00:40:30.824935 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:30.824889 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"]
Apr 25 00:40:30.825374 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:30.825300 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg" podUID="4f935536-443a-4832-9420-f06754a917c8" containerName="kserve-container" containerID="cri-o://b9419fded361108683355aea01ee1b906da21d58feeac35e5e488b90eae11c4f" gracePeriod=30
Apr 25 00:40:30.825432 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:30.825339 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg" podUID="4f935536-443a-4832-9420-f06754a917c8" containerName="kube-rbac-proxy" containerID="cri-o://3246b42d1ba251cc660bab7019ebe69b820605ed639eed28ad48f9003a4488b9" gracePeriod=30
Apr 25 00:40:30.895214 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:30.895183 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh"]
Apr 25 00:40:30.895457 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:30.895446 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f53103f-b3ff-4b31-a09f-b9aa912df6a9" containerName="storage-initializer"
Apr 25 00:40:30.895502 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:30.895459 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f53103f-b3ff-4b31-a09f-b9aa912df6a9" containerName="storage-initializer"
Apr 25 00:40:30.895502 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:30.895467 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f53103f-b3ff-4b31-a09f-b9aa912df6a9" containerName="kserve-container"
Apr 25 00:40:30.895502 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:30.895472 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f53103f-b3ff-4b31-a09f-b9aa912df6a9" containerName="kserve-container"
Apr 25 00:40:30.895502 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:30.895485 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f53103f-b3ff-4b31-a09f-b9aa912df6a9" containerName="kube-rbac-proxy"
Apr 25 00:40:30.895502 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:30.895493 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f53103f-b3ff-4b31-a09f-b9aa912df6a9" containerName="kube-rbac-proxy"
Apr 25 00:40:30.895651 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:30.895533 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f53103f-b3ff-4b31-a09f-b9aa912df6a9" containerName="kserve-container"
Apr 25 00:40:30.895651 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:30.895545 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f53103f-b3ff-4b31-a09f-b9aa912df6a9" containerName="kube-rbac-proxy"
Apr 25 00:40:30.898559 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:30.898541 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh"
Apr 25 00:40:30.900877 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:30.900856 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-predictor-serving-cert\""
Apr 25 00:40:30.900986 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:30.900859 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-kube-rbac-proxy-sar-config\""
Apr 25 00:40:30.906962 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:30.906895 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh"]
Apr 25 00:40:31.037936 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:31.037880 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5drtw\" (UniqueName: \"kubernetes.io/projected/075b7956-5aff-4b1b-8ca8-2c7c4681091a-kube-api-access-5drtw\") pod \"isvc-sklearn-v2-predictor-54df6cf899-b6dkh\" (UID: \"075b7956-5aff-4b1b-8ca8-2c7c4681091a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh"
Apr 25 00:40:31.038104 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:31.037951 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/075b7956-5aff-4b1b-8ca8-2c7c4681091a-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-54df6cf899-b6dkh\" (UID: \"075b7956-5aff-4b1b-8ca8-2c7c4681091a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh"
Apr 25 00:40:31.038104 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:31.037977 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/075b7956-5aff-4b1b-8ca8-2c7c4681091a-proxy-tls\") pod \"isvc-sklearn-v2-predictor-54df6cf899-b6dkh\" (UID: \"075b7956-5aff-4b1b-8ca8-2c7c4681091a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh"
Apr 25 00:40:31.038104 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:31.038013 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/075b7956-5aff-4b1b-8ca8-2c7c4681091a-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-54df6cf899-b6dkh\" (UID: \"075b7956-5aff-4b1b-8ca8-2c7c4681091a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh"
Apr 25 00:40:31.139380 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:31.139307 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/075b7956-5aff-4b1b-8ca8-2c7c4681091a-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-54df6cf899-b6dkh\" (UID: \"075b7956-5aff-4b1b-8ca8-2c7c4681091a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh"
Apr 25 00:40:31.139380 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:31.139349 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5drtw\" (UniqueName: \"kubernetes.io/projected/075b7956-5aff-4b1b-8ca8-2c7c4681091a-kube-api-access-5drtw\") pod \"isvc-sklearn-v2-predictor-54df6cf899-b6dkh\" (UID: \"075b7956-5aff-4b1b-8ca8-2c7c4681091a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh"
Apr 25 00:40:31.139564 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:31.139387 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/075b7956-5aff-4b1b-8ca8-2c7c4681091a-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-54df6cf899-b6dkh\" (UID: \"075b7956-5aff-4b1b-8ca8-2c7c4681091a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh"
Apr 25 00:40:31.139564 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:31.139424 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/075b7956-5aff-4b1b-8ca8-2c7c4681091a-proxy-tls\") pod \"isvc-sklearn-v2-predictor-54df6cf899-b6dkh\" (UID: \"075b7956-5aff-4b1b-8ca8-2c7c4681091a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh"
Apr 25 00:40:31.139564 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:40:31.139555 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-v2-predictor-serving-cert: secret "isvc-sklearn-v2-predictor-serving-cert" not found
Apr 25 00:40:31.139670 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:40:31.139630 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/075b7956-5aff-4b1b-8ca8-2c7c4681091a-proxy-tls podName:075b7956-5aff-4b1b-8ca8-2c7c4681091a nodeName:}" failed. No retries permitted until 2026-04-25 00:40:31.63960746 +0000 UTC m=+2791.920724358 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/075b7956-5aff-4b1b-8ca8-2c7c4681091a-proxy-tls") pod "isvc-sklearn-v2-predictor-54df6cf899-b6dkh" (UID: "075b7956-5aff-4b1b-8ca8-2c7c4681091a") : secret "isvc-sklearn-v2-predictor-serving-cert" not found
Apr 25 00:40:31.139714 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:31.139697 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/075b7956-5aff-4b1b-8ca8-2c7c4681091a-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-54df6cf899-b6dkh\" (UID: \"075b7956-5aff-4b1b-8ca8-2c7c4681091a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh"
Apr 25 00:40:31.140079 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:31.140062 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/075b7956-5aff-4b1b-8ca8-2c7c4681091a-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-54df6cf899-b6dkh\" (UID: \"075b7956-5aff-4b1b-8ca8-2c7c4681091a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh"
Apr 25 00:40:31.148653 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:31.148626 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5drtw\" (UniqueName: \"kubernetes.io/projected/075b7956-5aff-4b1b-8ca8-2c7c4681091a-kube-api-access-5drtw\") pod \"isvc-sklearn-v2-predictor-54df6cf899-b6dkh\" (UID: \"075b7956-5aff-4b1b-8ca8-2c7c4681091a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh"
Apr 25 00:40:31.518554 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:31.518521 2576 generic.go:358] "Generic (PLEG): container finished" podID="4f935536-443a-4832-9420-f06754a917c8" containerID="3246b42d1ba251cc660bab7019ebe69b820605ed639eed28ad48f9003a4488b9" exitCode=2
Apr 25 00:40:31.518711 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:31.518601 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg" event={"ID":"4f935536-443a-4832-9420-f06754a917c8","Type":"ContainerDied","Data":"3246b42d1ba251cc660bab7019ebe69b820605ed639eed28ad48f9003a4488b9"}
Apr 25 00:40:31.643235 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:31.643199 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/075b7956-5aff-4b1b-8ca8-2c7c4681091a-proxy-tls\") pod \"isvc-sklearn-v2-predictor-54df6cf899-b6dkh\" (UID: \"075b7956-5aff-4b1b-8ca8-2c7c4681091a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh"
Apr 25 00:40:31.645817 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:31.645787 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/075b7956-5aff-4b1b-8ca8-2c7c4681091a-proxy-tls\") pod \"isvc-sklearn-v2-predictor-54df6cf899-b6dkh\" (UID: \"075b7956-5aff-4b1b-8ca8-2c7c4681091a\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh"
Apr 25 00:40:31.809064 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:31.808983 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh"
Apr 25 00:40:31.928853 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:31.928816 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh"]
Apr 25 00:40:31.932851 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:40:31.932819 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod075b7956_5aff_4b1b_8ca8_2c7c4681091a.slice/crio-939f568a22baf155c0f82042d3a37a48f45e15b42d1e954d29614bf0b61f1ff7 WatchSource:0}: Error finding container 939f568a22baf155c0f82042d3a37a48f45e15b42d1e954d29614bf0b61f1ff7: Status 404 returned error can't find the container with id 939f568a22baf155c0f82042d3a37a48f45e15b42d1e954d29614bf0b61f1ff7
Apr 25 00:40:32.368269 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:32.368223 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg" podUID="4f935536-443a-4832-9420-f06754a917c8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.50:8643/healthz\": dial tcp 10.134.0.50:8643: connect: connection refused"
Apr 25 00:40:32.523142 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:32.523106 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" event={"ID":"075b7956-5aff-4b1b-8ca8-2c7c4681091a","Type":"ContainerStarted","Data":"40a896890c1d522a8f60aebc8ad3f7cf227e95837e208b2b4c0025455ee8ca9d"}
Apr 25 00:40:32.523142 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:32.523141 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" event={"ID":"075b7956-5aff-4b1b-8ca8-2c7c4681091a","Type":"ContainerStarted","Data":"939f568a22baf155c0f82042d3a37a48f45e15b42d1e954d29614bf0b61f1ff7"}
Apr 25
00:40:33.415094 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:33.415046 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg" podUID="4f935536-443a-4832-9420-f06754a917c8" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.50:8080/v2/models/isvc-sklearn-v2-runtime/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 25 00:40:36.536632 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:36.536599 2576 generic.go:358] "Generic (PLEG): container finished" podID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" containerID="40a896890c1d522a8f60aebc8ad3f7cf227e95837e208b2b4c0025455ee8ca9d" exitCode=0 Apr 25 00:40:36.537036 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:36.536652 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" event={"ID":"075b7956-5aff-4b1b-8ca8-2c7c4681091a","Type":"ContainerDied","Data":"40a896890c1d522a8f60aebc8ad3f7cf227e95837e208b2b4c0025455ee8ca9d"} Apr 25 00:40:37.368307 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:37.368264 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg" podUID="4f935536-443a-4832-9420-f06754a917c8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.50:8643/healthz\": dial tcp 10.134.0.50:8643: connect: connection refused" Apr 25 00:40:37.541956 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:37.541901 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" event={"ID":"075b7956-5aff-4b1b-8ca8-2c7c4681091a","Type":"ContainerStarted","Data":"89f7338806f04ed38c4c9ea632ec86f9666890af803c21447b350a0cd5642676"} Apr 25 00:40:37.541956 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:37.541959 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" event={"ID":"075b7956-5aff-4b1b-8ca8-2c7c4681091a","Type":"ContainerStarted","Data":"5c167cd9965b5bd0000a7be01e2f5b95fba1dcb7194ae564405914d3818bda6c"} Apr 25 00:40:37.542374 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:37.542171 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" Apr 25 00:40:37.560603 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:37.560558 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" podStartSLOduration=7.560542869 podStartE2EDuration="7.560542869s" podCreationTimestamp="2026-04-25 00:40:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:40:37.558808643 +0000 UTC m=+2797.839925562" watchObservedRunningTime="2026-04-25 00:40:37.560542869 +0000 UTC m=+2797.841659789" Apr 25 00:40:37.768223 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:37.768194 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg" Apr 25 00:40:37.890287 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:37.890259 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f935536-443a-4832-9420-f06754a917c8-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"4f935536-443a-4832-9420-f06754a917c8\" (UID: \"4f935536-443a-4832-9420-f06754a917c8\") " Apr 25 00:40:37.890452 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:37.890292 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f935536-443a-4832-9420-f06754a917c8-proxy-tls\") pod \"4f935536-443a-4832-9420-f06754a917c8\" (UID: \"4f935536-443a-4832-9420-f06754a917c8\") " Apr 25 00:40:37.890452 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:37.890316 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f935536-443a-4832-9420-f06754a917c8-kserve-provision-location\") pod \"4f935536-443a-4832-9420-f06754a917c8\" (UID: \"4f935536-443a-4832-9420-f06754a917c8\") " Apr 25 00:40:37.890452 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:37.890355 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpjng\" (UniqueName: \"kubernetes.io/projected/4f935536-443a-4832-9420-f06754a917c8-kube-api-access-kpjng\") pod \"4f935536-443a-4832-9420-f06754a917c8\" (UID: \"4f935536-443a-4832-9420-f06754a917c8\") " Apr 25 00:40:37.890684 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:37.890657 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f935536-443a-4832-9420-f06754a917c8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"4f935536-443a-4832-9420-f06754a917c8" (UID: "4f935536-443a-4832-9420-f06754a917c8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:40:37.890739 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:37.890659 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f935536-443a-4832-9420-f06754a917c8-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config") pod "4f935536-443a-4832-9420-f06754a917c8" (UID: "4f935536-443a-4832-9420-f06754a917c8"). InnerVolumeSpecName "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:40:37.892474 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:37.892451 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f935536-443a-4832-9420-f06754a917c8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4f935536-443a-4832-9420-f06754a917c8" (UID: "4f935536-443a-4832-9420-f06754a917c8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:40:37.892548 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:37.892531 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f935536-443a-4832-9420-f06754a917c8-kube-api-access-kpjng" (OuterVolumeSpecName: "kube-api-access-kpjng") pod "4f935536-443a-4832-9420-f06754a917c8" (UID: "4f935536-443a-4832-9420-f06754a917c8"). InnerVolumeSpecName "kube-api-access-kpjng". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:40:37.990989 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:37.990961 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kpjng\" (UniqueName: \"kubernetes.io/projected/4f935536-443a-4832-9420-f06754a917c8-kube-api-access-kpjng\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:40:37.990989 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:37.990983 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f935536-443a-4832-9420-f06754a917c8-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:40:37.990989 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:37.990993 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f935536-443a-4832-9420-f06754a917c8-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:40:37.991194 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:37.991003 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f935536-443a-4832-9420-f06754a917c8-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:40:38.546707 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:38.546674 2576 generic.go:358] "Generic (PLEG): container finished" podID="4f935536-443a-4832-9420-f06754a917c8" containerID="b9419fded361108683355aea01ee1b906da21d58feeac35e5e488b90eae11c4f" exitCode=0 Apr 25 00:40:38.547213 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:38.546748 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg" Apr 25 00:40:38.547213 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:38.546756 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg" event={"ID":"4f935536-443a-4832-9420-f06754a917c8","Type":"ContainerDied","Data":"b9419fded361108683355aea01ee1b906da21d58feeac35e5e488b90eae11c4f"} Apr 25 00:40:38.547213 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:38.546791 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg" event={"ID":"4f935536-443a-4832-9420-f06754a917c8","Type":"ContainerDied","Data":"bfb429cdc66061427166aedfcb01d4880a100a8c8406ae589935721e1fb840eb"} Apr 25 00:40:38.547213 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:38.546814 2576 scope.go:117] "RemoveContainer" containerID="3246b42d1ba251cc660bab7019ebe69b820605ed639eed28ad48f9003a4488b9" Apr 25 00:40:38.547438 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:38.547381 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" Apr 25 00:40:38.548932 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:38.548881 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" podUID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 25 00:40:38.554783 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:38.554766 2576 scope.go:117] "RemoveContainer" containerID="b9419fded361108683355aea01ee1b906da21d58feeac35e5e488b90eae11c4f" Apr 25 00:40:38.562160 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:38.562137 2576 scope.go:117] "RemoveContainer" 
containerID="ddea857e535fac809ecc98dd30f62e2e80500ef630fb1b41229c96d360c1ed2a" Apr 25 00:40:38.563504 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:38.563485 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"] Apr 25 00:40:38.566954 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:38.566932 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-dz6rg"] Apr 25 00:40:38.569853 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:38.569836 2576 scope.go:117] "RemoveContainer" containerID="3246b42d1ba251cc660bab7019ebe69b820605ed639eed28ad48f9003a4488b9" Apr 25 00:40:38.570152 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:40:38.570134 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3246b42d1ba251cc660bab7019ebe69b820605ed639eed28ad48f9003a4488b9\": container with ID starting with 3246b42d1ba251cc660bab7019ebe69b820605ed639eed28ad48f9003a4488b9 not found: ID does not exist" containerID="3246b42d1ba251cc660bab7019ebe69b820605ed639eed28ad48f9003a4488b9" Apr 25 00:40:38.570213 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:38.570169 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3246b42d1ba251cc660bab7019ebe69b820605ed639eed28ad48f9003a4488b9"} err="failed to get container status \"3246b42d1ba251cc660bab7019ebe69b820605ed639eed28ad48f9003a4488b9\": rpc error: code = NotFound desc = could not find container \"3246b42d1ba251cc660bab7019ebe69b820605ed639eed28ad48f9003a4488b9\": container with ID starting with 3246b42d1ba251cc660bab7019ebe69b820605ed639eed28ad48f9003a4488b9 not found: ID does not exist" Apr 25 00:40:38.570213 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:38.570190 2576 scope.go:117] "RemoveContainer" containerID="b9419fded361108683355aea01ee1b906da21d58feeac35e5e488b90eae11c4f" 
Apr 25 00:40:38.570457 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:40:38.570439 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9419fded361108683355aea01ee1b906da21d58feeac35e5e488b90eae11c4f\": container with ID starting with b9419fded361108683355aea01ee1b906da21d58feeac35e5e488b90eae11c4f not found: ID does not exist" containerID="b9419fded361108683355aea01ee1b906da21d58feeac35e5e488b90eae11c4f" Apr 25 00:40:38.570491 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:38.570465 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9419fded361108683355aea01ee1b906da21d58feeac35e5e488b90eae11c4f"} err="failed to get container status \"b9419fded361108683355aea01ee1b906da21d58feeac35e5e488b90eae11c4f\": rpc error: code = NotFound desc = could not find container \"b9419fded361108683355aea01ee1b906da21d58feeac35e5e488b90eae11c4f\": container with ID starting with b9419fded361108683355aea01ee1b906da21d58feeac35e5e488b90eae11c4f not found: ID does not exist" Apr 25 00:40:38.570491 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:38.570483 2576 scope.go:117] "RemoveContainer" containerID="ddea857e535fac809ecc98dd30f62e2e80500ef630fb1b41229c96d360c1ed2a" Apr 25 00:40:38.570716 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:40:38.570690 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddea857e535fac809ecc98dd30f62e2e80500ef630fb1b41229c96d360c1ed2a\": container with ID starting with ddea857e535fac809ecc98dd30f62e2e80500ef630fb1b41229c96d360c1ed2a not found: ID does not exist" containerID="ddea857e535fac809ecc98dd30f62e2e80500ef630fb1b41229c96d360c1ed2a" Apr 25 00:40:38.570753 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:38.570722 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ddea857e535fac809ecc98dd30f62e2e80500ef630fb1b41229c96d360c1ed2a"} err="failed to get container status \"ddea857e535fac809ecc98dd30f62e2e80500ef630fb1b41229c96d360c1ed2a\": rpc error: code = NotFound desc = could not find container \"ddea857e535fac809ecc98dd30f62e2e80500ef630fb1b41229c96d360c1ed2a\": container with ID starting with ddea857e535fac809ecc98dd30f62e2e80500ef630fb1b41229c96d360c1ed2a not found: ID does not exist" Apr 25 00:40:39.550874 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:39.550835 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" podUID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 25 00:40:40.315688 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:40.315654 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f935536-443a-4832-9420-f06754a917c8" path="/var/lib/kubelet/pods/4f935536-443a-4832-9420-f06754a917c8/volumes" Apr 25 00:40:44.555200 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:44.555170 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" Apr 25 00:40:44.555695 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:44.555667 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" podUID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 25 00:40:54.556074 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:40:54.556034 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" podUID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 25 00:41:04.556211 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:04.556168 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" podUID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 25 00:41:14.555779 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:14.555737 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" podUID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 25 00:41:24.556108 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:24.556068 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" podUID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 25 00:41:34.556349 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:34.556307 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" podUID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 25 00:41:44.556499 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:44.556470 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" Apr 25 00:41:51.050784 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.050750 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh"] Apr 25 00:41:51.051204 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.051078 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" podUID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" containerName="kserve-container" containerID="cri-o://5c167cd9965b5bd0000a7be01e2f5b95fba1dcb7194ae564405914d3818bda6c" gracePeriod=30 Apr 25 00:41:51.051204 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.051142 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" podUID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" containerName="kube-rbac-proxy" containerID="cri-o://89f7338806f04ed38c4c9ea632ec86f9666890af803c21447b350a0cd5642676" gracePeriod=30 Apr 25 00:41:51.149299 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.149261 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv"] Apr 25 00:41:51.149595 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.149582 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f935536-443a-4832-9420-f06754a917c8" containerName="kserve-container" Apr 25 00:41:51.149641 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.149598 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f935536-443a-4832-9420-f06754a917c8" containerName="kserve-container" Apr 25 00:41:51.149641 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.149608 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f935536-443a-4832-9420-f06754a917c8" containerName="kube-rbac-proxy" Apr 25 00:41:51.149641 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.149614 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f935536-443a-4832-9420-f06754a917c8" containerName="kube-rbac-proxy" Apr 25 
00:41:51.149641 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.149622 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f935536-443a-4832-9420-f06754a917c8" containerName="storage-initializer" Apr 25 00:41:51.149641 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.149627 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f935536-443a-4832-9420-f06754a917c8" containerName="storage-initializer" Apr 25 00:41:51.149794 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.149685 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f935536-443a-4832-9420-f06754a917c8" containerName="kube-rbac-proxy" Apr 25 00:41:51.149794 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.149694 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f935536-443a-4832-9420-f06754a917c8" containerName="kserve-container" Apr 25 00:41:51.152876 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.152856 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" Apr 25 00:41:51.155385 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.155362 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\"" Apr 25 00:41:51.155516 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.155399 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-predictor-serving-cert\"" Apr 25 00:41:51.162170 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.162149 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv"] Apr 25 00:41:51.173877 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.173857 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnln2\" (UniqueName: \"kubernetes.io/projected/da551da4-85dd-4794-bb23-8cd99a952716-kube-api-access-gnln2\") pod \"isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv\" (UID: \"da551da4-85dd-4794-bb23-8cd99a952716\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" Apr 25 00:41:51.173994 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.173889 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da551da4-85dd-4794-bb23-8cd99a952716-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv\" (UID: \"da551da4-85dd-4794-bb23-8cd99a952716\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" Apr 25 00:41:51.173994 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.173908 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/da551da4-85dd-4794-bb23-8cd99a952716-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv\" (UID: \"da551da4-85dd-4794-bb23-8cd99a952716\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" Apr 25 00:41:51.174090 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.174012 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da551da4-85dd-4794-bb23-8cd99a952716-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv\" (UID: \"da551da4-85dd-4794-bb23-8cd99a952716\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" Apr 25 00:41:51.275064 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.275031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da551da4-85dd-4794-bb23-8cd99a952716-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv\" (UID: \"da551da4-85dd-4794-bb23-8cd99a952716\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" Apr 25 00:41:51.275196 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.275088 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnln2\" (UniqueName: \"kubernetes.io/projected/da551da4-85dd-4794-bb23-8cd99a952716-kube-api-access-gnln2\") pod \"isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv\" (UID: \"da551da4-85dd-4794-bb23-8cd99a952716\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" Apr 25 00:41:51.275196 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.275124 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/da551da4-85dd-4794-bb23-8cd99a952716-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv\" (UID: \"da551da4-85dd-4794-bb23-8cd99a952716\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" Apr 25 00:41:51.275196 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.275153 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da551da4-85dd-4794-bb23-8cd99a952716-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv\" (UID: \"da551da4-85dd-4794-bb23-8cd99a952716\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" Apr 25 00:41:51.275367 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:41:51.275284 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-serving-cert: secret "isvc-sklearn-v2-mixed-predictor-serving-cert" not found Apr 25 00:41:51.275367 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:41:51.275350 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da551da4-85dd-4794-bb23-8cd99a952716-proxy-tls podName:da551da4-85dd-4794-bb23-8cd99a952716 nodeName:}" failed. No retries permitted until 2026-04-25 00:41:51.775330046 +0000 UTC m=+2872.056446951 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/da551da4-85dd-4794-bb23-8cd99a952716-proxy-tls") pod "isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" (UID: "da551da4-85dd-4794-bb23-8cd99a952716") : secret "isvc-sklearn-v2-mixed-predictor-serving-cert" not found Apr 25 00:41:51.275553 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.275531 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da551da4-85dd-4794-bb23-8cd99a952716-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv\" (UID: \"da551da4-85dd-4794-bb23-8cd99a952716\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" Apr 25 00:41:51.275692 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.275674 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da551da4-85dd-4794-bb23-8cd99a952716-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv\" (UID: \"da551da4-85dd-4794-bb23-8cd99a952716\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" Apr 25 00:41:51.283596 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.283576 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnln2\" (UniqueName: \"kubernetes.io/projected/da551da4-85dd-4794-bb23-8cd99a952716-kube-api-access-gnln2\") pod \"isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv\" (UID: \"da551da4-85dd-4794-bb23-8cd99a952716\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" Apr 25 00:41:51.755232 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.755200 2576 generic.go:358] "Generic (PLEG): container finished" podID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" 
containerID="89f7338806f04ed38c4c9ea632ec86f9666890af803c21447b350a0cd5642676" exitCode=2 Apr 25 00:41:51.755413 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.755238 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" event={"ID":"075b7956-5aff-4b1b-8ca8-2c7c4681091a","Type":"ContainerDied","Data":"89f7338806f04ed38c4c9ea632ec86f9666890af803c21447b350a0cd5642676"} Apr 25 00:41:51.777712 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.777679 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da551da4-85dd-4794-bb23-8cd99a952716-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv\" (UID: \"da551da4-85dd-4794-bb23-8cd99a952716\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" Apr 25 00:41:51.780269 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:51.780241 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da551da4-85dd-4794-bb23-8cd99a952716-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv\" (UID: \"da551da4-85dd-4794-bb23-8cd99a952716\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" Apr 25 00:41:52.065498 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:52.065469 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" Apr 25 00:41:52.187475 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:52.187299 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv"] Apr 25 00:41:52.190010 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:41:52.189968 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda551da4_85dd_4794_bb23_8cd99a952716.slice/crio-5f10c22971042fc9a35794ecc769eb418058d522611f940f7bc98f05d89edff8 WatchSource:0}: Error finding container 5f10c22971042fc9a35794ecc769eb418058d522611f940f7bc98f05d89edff8: Status 404 returned error can't find the container with id 5f10c22971042fc9a35794ecc769eb418058d522611f940f7bc98f05d89edff8 Apr 25 00:41:52.759143 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:52.759110 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" event={"ID":"da551da4-85dd-4794-bb23-8cd99a952716","Type":"ContainerStarted","Data":"abcaeba355595988f12edcfa1c9d0a54edb8c034ba2ece766129299d81caddc5"} Apr 25 00:41:52.759143 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:52.759149 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" event={"ID":"da551da4-85dd-4794-bb23-8cd99a952716","Type":"ContainerStarted","Data":"5f10c22971042fc9a35794ecc769eb418058d522611f940f7bc98f05d89edff8"} Apr 25 00:41:54.551412 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:54.551365 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" podUID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.51:8643/healthz\": dial tcp 10.134.0.51:8643: connect: connection 
refused" Apr 25 00:41:54.555659 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:54.555632 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" podUID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused" Apr 25 00:41:55.081003 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.080982 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" Apr 25 00:41:55.104765 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.104739 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/075b7956-5aff-4b1b-8ca8-2c7c4681091a-proxy-tls\") pod \"075b7956-5aff-4b1b-8ca8-2c7c4681091a\" (UID: \"075b7956-5aff-4b1b-8ca8-2c7c4681091a\") " Apr 25 00:41:55.104899 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.104792 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5drtw\" (UniqueName: \"kubernetes.io/projected/075b7956-5aff-4b1b-8ca8-2c7c4681091a-kube-api-access-5drtw\") pod \"075b7956-5aff-4b1b-8ca8-2c7c4681091a\" (UID: \"075b7956-5aff-4b1b-8ca8-2c7c4681091a\") " Apr 25 00:41:55.104899 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.104819 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/075b7956-5aff-4b1b-8ca8-2c7c4681091a-kserve-provision-location\") pod \"075b7956-5aff-4b1b-8ca8-2c7c4681091a\" (UID: \"075b7956-5aff-4b1b-8ca8-2c7c4681091a\") " Apr 25 00:41:55.104899 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.104851 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/075b7956-5aff-4b1b-8ca8-2c7c4681091a-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"075b7956-5aff-4b1b-8ca8-2c7c4681091a\" (UID: \"075b7956-5aff-4b1b-8ca8-2c7c4681091a\") " Apr 25 00:41:55.105165 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.105140 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/075b7956-5aff-4b1b-8ca8-2c7c4681091a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "075b7956-5aff-4b1b-8ca8-2c7c4681091a" (UID: "075b7956-5aff-4b1b-8ca8-2c7c4681091a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:41:55.105299 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.105266 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/075b7956-5aff-4b1b-8ca8-2c7c4681091a-isvc-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-kube-rbac-proxy-sar-config") pod "075b7956-5aff-4b1b-8ca8-2c7c4681091a" (UID: "075b7956-5aff-4b1b-8ca8-2c7c4681091a"). InnerVolumeSpecName "isvc-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:41:55.106957 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.106932 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/075b7956-5aff-4b1b-8ca8-2c7c4681091a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "075b7956-5aff-4b1b-8ca8-2c7c4681091a" (UID: "075b7956-5aff-4b1b-8ca8-2c7c4681091a"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:41:55.107053 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.107029 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/075b7956-5aff-4b1b-8ca8-2c7c4681091a-kube-api-access-5drtw" (OuterVolumeSpecName: "kube-api-access-5drtw") pod "075b7956-5aff-4b1b-8ca8-2c7c4681091a" (UID: "075b7956-5aff-4b1b-8ca8-2c7c4681091a"). InnerVolumeSpecName "kube-api-access-5drtw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:41:55.206322 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.206255 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/075b7956-5aff-4b1b-8ca8-2c7c4681091a-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:41:55.206322 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.206286 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5drtw\" (UniqueName: \"kubernetes.io/projected/075b7956-5aff-4b1b-8ca8-2c7c4681091a-kube-api-access-5drtw\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:41:55.206322 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.206297 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/075b7956-5aff-4b1b-8ca8-2c7c4681091a-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:41:55.206322 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.206309 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/075b7956-5aff-4b1b-8ca8-2c7c4681091a-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:41:55.768344 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.768309 2576 generic.go:358] "Generic (PLEG): container 
finished" podID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" containerID="5c167cd9965b5bd0000a7be01e2f5b95fba1dcb7194ae564405914d3818bda6c" exitCode=0 Apr 25 00:41:55.768788 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.768390 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" Apr 25 00:41:55.768788 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.768400 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" event={"ID":"075b7956-5aff-4b1b-8ca8-2c7c4681091a","Type":"ContainerDied","Data":"5c167cd9965b5bd0000a7be01e2f5b95fba1dcb7194ae564405914d3818bda6c"} Apr 25 00:41:55.768788 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.768451 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh" event={"ID":"075b7956-5aff-4b1b-8ca8-2c7c4681091a","Type":"ContainerDied","Data":"939f568a22baf155c0f82042d3a37a48f45e15b42d1e954d29614bf0b61f1ff7"} Apr 25 00:41:55.768788 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.768476 2576 scope.go:117] "RemoveContainer" containerID="89f7338806f04ed38c4c9ea632ec86f9666890af803c21447b350a0cd5642676" Apr 25 00:41:55.776895 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.776878 2576 scope.go:117] "RemoveContainer" containerID="5c167cd9965b5bd0000a7be01e2f5b95fba1dcb7194ae564405914d3818bda6c" Apr 25 00:41:55.783689 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.783674 2576 scope.go:117] "RemoveContainer" containerID="40a896890c1d522a8f60aebc8ad3f7cf227e95837e208b2b4c0025455ee8ca9d" Apr 25 00:41:55.790710 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.790686 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh"] Apr 25 00:41:55.791281 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.791264 2576 scope.go:117] 
"RemoveContainer" containerID="89f7338806f04ed38c4c9ea632ec86f9666890af803c21447b350a0cd5642676" Apr 25 00:41:55.792299 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:41:55.792259 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89f7338806f04ed38c4c9ea632ec86f9666890af803c21447b350a0cd5642676\": container with ID starting with 89f7338806f04ed38c4c9ea632ec86f9666890af803c21447b350a0cd5642676 not found: ID does not exist" containerID="89f7338806f04ed38c4c9ea632ec86f9666890af803c21447b350a0cd5642676" Apr 25 00:41:55.792402 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.792306 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89f7338806f04ed38c4c9ea632ec86f9666890af803c21447b350a0cd5642676"} err="failed to get container status \"89f7338806f04ed38c4c9ea632ec86f9666890af803c21447b350a0cd5642676\": rpc error: code = NotFound desc = could not find container \"89f7338806f04ed38c4c9ea632ec86f9666890af803c21447b350a0cd5642676\": container with ID starting with 89f7338806f04ed38c4c9ea632ec86f9666890af803c21447b350a0cd5642676 not found: ID does not exist" Apr 25 00:41:55.792402 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.792324 2576 scope.go:117] "RemoveContainer" containerID="5c167cd9965b5bd0000a7be01e2f5b95fba1dcb7194ae564405914d3818bda6c" Apr 25 00:41:55.792621 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:41:55.792591 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c167cd9965b5bd0000a7be01e2f5b95fba1dcb7194ae564405914d3818bda6c\": container with ID starting with 5c167cd9965b5bd0000a7be01e2f5b95fba1dcb7194ae564405914d3818bda6c not found: ID does not exist" containerID="5c167cd9965b5bd0000a7be01e2f5b95fba1dcb7194ae564405914d3818bda6c" Apr 25 00:41:55.792704 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.792628 2576 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c167cd9965b5bd0000a7be01e2f5b95fba1dcb7194ae564405914d3818bda6c"} err="failed to get container status \"5c167cd9965b5bd0000a7be01e2f5b95fba1dcb7194ae564405914d3818bda6c\": rpc error: code = NotFound desc = could not find container \"5c167cd9965b5bd0000a7be01e2f5b95fba1dcb7194ae564405914d3818bda6c\": container with ID starting with 5c167cd9965b5bd0000a7be01e2f5b95fba1dcb7194ae564405914d3818bda6c not found: ID does not exist" Apr 25 00:41:55.792704 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.792644 2576 scope.go:117] "RemoveContainer" containerID="40a896890c1d522a8f60aebc8ad3f7cf227e95837e208b2b4c0025455ee8ca9d" Apr 25 00:41:55.792887 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:41:55.792871 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40a896890c1d522a8f60aebc8ad3f7cf227e95837e208b2b4c0025455ee8ca9d\": container with ID starting with 40a896890c1d522a8f60aebc8ad3f7cf227e95837e208b2b4c0025455ee8ca9d not found: ID does not exist" containerID="40a896890c1d522a8f60aebc8ad3f7cf227e95837e208b2b4c0025455ee8ca9d" Apr 25 00:41:55.792969 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.792891 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40a896890c1d522a8f60aebc8ad3f7cf227e95837e208b2b4c0025455ee8ca9d"} err="failed to get container status \"40a896890c1d522a8f60aebc8ad3f7cf227e95837e208b2b4c0025455ee8ca9d\": rpc error: code = NotFound desc = could not find container \"40a896890c1d522a8f60aebc8ad3f7cf227e95837e208b2b4c0025455ee8ca9d\": container with ID starting with 40a896890c1d522a8f60aebc8ad3f7cf227e95837e208b2b4c0025455ee8ca9d not found: ID does not exist" Apr 25 00:41:55.793394 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:55.793373 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-54df6cf899-b6dkh"] Apr 
25 00:41:56.315067 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:56.315028 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" path="/var/lib/kubelet/pods/075b7956-5aff-4b1b-8ca8-2c7c4681091a/volumes" Apr 25 00:41:56.772554 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:56.772516 2576 generic.go:358] "Generic (PLEG): container finished" podID="da551da4-85dd-4794-bb23-8cd99a952716" containerID="abcaeba355595988f12edcfa1c9d0a54edb8c034ba2ece766129299d81caddc5" exitCode=0 Apr 25 00:41:56.773036 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:56.772595 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" event={"ID":"da551da4-85dd-4794-bb23-8cd99a952716","Type":"ContainerDied","Data":"abcaeba355595988f12edcfa1c9d0a54edb8c034ba2ece766129299d81caddc5"} Apr 25 00:41:57.778342 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:57.778302 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" event={"ID":"da551da4-85dd-4794-bb23-8cd99a952716","Type":"ContainerStarted","Data":"ad892a18598fc1cabfc23f7dac610f4859983afd743753e55cae24f03bb90ea2"} Apr 25 00:41:57.778733 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:57.778350 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" event={"ID":"da551da4-85dd-4794-bb23-8cd99a952716","Type":"ContainerStarted","Data":"6582d5a74e05a46c539d0f1c45d4a975d4c3429c8af7da317760b955f01bf1a3"} Apr 25 00:41:57.778733 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:57.778705 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" Apr 25 00:41:57.778847 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:57.778827 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" Apr 25 00:41:57.780309 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:57.780285 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" podUID="da551da4-85dd-4794-bb23-8cd99a952716" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 25 00:41:57.795413 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:57.795378 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" podStartSLOduration=6.795367727 podStartE2EDuration="6.795367727s" podCreationTimestamp="2026-04-25 00:41:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:41:57.794199636 +0000 UTC m=+2878.075316569" watchObservedRunningTime="2026-04-25 00:41:57.795367727 +0000 UTC m=+2878.076484647" Apr 25 00:41:58.781707 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:41:58.781659 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" podUID="da551da4-85dd-4794-bb23-8cd99a952716" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 25 00:42:03.786288 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:42:03.786262 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" Apr 25 00:42:03.786910 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:42:03.786884 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" podUID="da551da4-85dd-4794-bb23-8cd99a952716" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 25 00:42:13.787550 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:42:13.787509 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" podUID="da551da4-85dd-4794-bb23-8cd99a952716" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 25 00:42:23.786837 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:42:23.786796 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" podUID="da551da4-85dd-4794-bb23-8cd99a952716" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 25 00:42:33.787193 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:42:33.787152 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" podUID="da551da4-85dd-4794-bb23-8cd99a952716" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 25 00:42:43.787452 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:42:43.787413 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" podUID="da551da4-85dd-4794-bb23-8cd99a952716" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 25 00:42:53.787732 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:42:53.787694 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" podUID="da551da4-85dd-4794-bb23-8cd99a952716" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection 
refused" Apr 25 00:43:03.788077 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:03.788049 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" Apr 25 00:43:11.244009 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.243909 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv"] Apr 25 00:43:11.244388 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.244247 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" podUID="da551da4-85dd-4794-bb23-8cd99a952716" containerName="kserve-container" containerID="cri-o://6582d5a74e05a46c539d0f1c45d4a975d4c3429c8af7da317760b955f01bf1a3" gracePeriod=30 Apr 25 00:43:11.244388 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.244304 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" podUID="da551da4-85dd-4794-bb23-8cd99a952716" containerName="kube-rbac-proxy" containerID="cri-o://ad892a18598fc1cabfc23f7dac610f4859983afd743753e55cae24f03bb90ea2" gracePeriod=30 Apr 25 00:43:11.319285 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.319260 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq"] Apr 25 00:43:11.319563 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.319550 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" containerName="kserve-container" Apr 25 00:43:11.319611 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.319567 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" containerName="kserve-container" Apr 25 00:43:11.319611 ip-10-0-129-109 kubenswrapper[2576]: I0425 
00:43:11.319578 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" containerName="kube-rbac-proxy" Apr 25 00:43:11.319611 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.319584 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" containerName="kube-rbac-proxy" Apr 25 00:43:11.319611 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.319590 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" containerName="storage-initializer" Apr 25 00:43:11.319611 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.319596 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" containerName="storage-initializer" Apr 25 00:43:11.319760 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.319644 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" containerName="kserve-container" Apr 25 00:43:11.319760 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.319654 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="075b7956-5aff-4b1b-8ca8-2c7c4681091a" containerName="kube-rbac-proxy" Apr 25 00:43:11.322584 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.322566 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" Apr 25 00:43:11.325021 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.324978 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-kube-rbac-proxy-sar-config\"" Apr 25 00:43:11.325127 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.325111 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-predictor-serving-cert\"" Apr 25 00:43:11.332621 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.332600 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq"] Apr 25 00:43:11.439635 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.439606 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2890b2b-4443-40a7-b918-3e97d3d46292-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-zg7cq\" (UID: \"a2890b2b-4443-40a7-b918-3e97d3d46292\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" Apr 25 00:43:11.439748 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.439647 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a2890b2b-4443-40a7-b918-3e97d3d46292-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-zg7cq\" (UID: \"a2890b2b-4443-40a7-b918-3e97d3d46292\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" Apr 25 00:43:11.439748 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.439703 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/a2890b2b-4443-40a7-b918-3e97d3d46292-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-zg7cq\" (UID: \"a2890b2b-4443-40a7-b918-3e97d3d46292\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" Apr 25 00:43:11.439844 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.439803 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zplzb\" (UniqueName: \"kubernetes.io/projected/a2890b2b-4443-40a7-b918-3e97d3d46292-kube-api-access-zplzb\") pod \"isvc-tensorflow-predictor-6756f669d7-zg7cq\" (UID: \"a2890b2b-4443-40a7-b918-3e97d3d46292\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" Apr 25 00:43:11.541144 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.541107 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zplzb\" (UniqueName: \"kubernetes.io/projected/a2890b2b-4443-40a7-b918-3e97d3d46292-kube-api-access-zplzb\") pod \"isvc-tensorflow-predictor-6756f669d7-zg7cq\" (UID: \"a2890b2b-4443-40a7-b918-3e97d3d46292\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" Apr 25 00:43:11.541331 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.541170 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2890b2b-4443-40a7-b918-3e97d3d46292-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-zg7cq\" (UID: \"a2890b2b-4443-40a7-b918-3e97d3d46292\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" Apr 25 00:43:11.541331 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.541194 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a2890b2b-4443-40a7-b918-3e97d3d46292-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod 
\"isvc-tensorflow-predictor-6756f669d7-zg7cq\" (UID: \"a2890b2b-4443-40a7-b918-3e97d3d46292\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" Apr 25 00:43:11.541331 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.541210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2890b2b-4443-40a7-b918-3e97d3d46292-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-zg7cq\" (UID: \"a2890b2b-4443-40a7-b918-3e97d3d46292\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" Apr 25 00:43:11.541728 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.541703 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2890b2b-4443-40a7-b918-3e97d3d46292-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-zg7cq\" (UID: \"a2890b2b-4443-40a7-b918-3e97d3d46292\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" Apr 25 00:43:11.541853 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.541836 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a2890b2b-4443-40a7-b918-3e97d3d46292-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-zg7cq\" (UID: \"a2890b2b-4443-40a7-b918-3e97d3d46292\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" Apr 25 00:43:11.543874 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.543855 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2890b2b-4443-40a7-b918-3e97d3d46292-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-zg7cq\" (UID: \"a2890b2b-4443-40a7-b918-3e97d3d46292\") " 
pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq"
Apr 25 00:43:11.550809 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.550784 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zplzb\" (UniqueName: \"kubernetes.io/projected/a2890b2b-4443-40a7-b918-3e97d3d46292-kube-api-access-zplzb\") pod \"isvc-tensorflow-predictor-6756f669d7-zg7cq\" (UID: \"a2890b2b-4443-40a7-b918-3e97d3d46292\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq"
Apr 25 00:43:11.632899 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.632867 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq"
Apr 25 00:43:11.757116 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.757086 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq"]
Apr 25 00:43:11.759588 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:43:11.759551 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2890b2b_4443_40a7_b918_3e97d3d46292.slice/crio-de2efabd5c1882ab2e688e56622d7578ccd1c068fcd432bc1cd1a21f3c0a7e27 WatchSource:0}: Error finding container de2efabd5c1882ab2e688e56622d7578ccd1c068fcd432bc1cd1a21f3c0a7e27: Status 404 returned error can't find the container with id de2efabd5c1882ab2e688e56622d7578ccd1c068fcd432bc1cd1a21f3c0a7e27
Apr 25 00:43:11.761557 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.761541 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 25 00:43:11.978352 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.978308 2576 generic.go:358] "Generic (PLEG): container finished" podID="da551da4-85dd-4794-bb23-8cd99a952716" containerID="ad892a18598fc1cabfc23f7dac610f4859983afd743753e55cae24f03bb90ea2" exitCode=2
Apr 25 00:43:11.978352 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.978328 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" event={"ID":"da551da4-85dd-4794-bb23-8cd99a952716","Type":"ContainerDied","Data":"ad892a18598fc1cabfc23f7dac610f4859983afd743753e55cae24f03bb90ea2"}
Apr 25 00:43:11.979927 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.979886 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" event={"ID":"a2890b2b-4443-40a7-b918-3e97d3d46292","Type":"ContainerStarted","Data":"e3f128b5ddaab18e47991a147ed2569f7eda6eb191d616719c762a5890581785"}
Apr 25 00:43:11.980072 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:11.979938 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" event={"ID":"a2890b2b-4443-40a7-b918-3e97d3d46292","Type":"ContainerStarted","Data":"de2efabd5c1882ab2e688e56622d7578ccd1c068fcd432bc1cd1a21f3c0a7e27"}
Apr 25 00:43:13.782797 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:13.782752 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" podUID="da551da4-85dd-4794-bb23-8cd99a952716" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.52:8643/healthz\": dial tcp 10.134.0.52:8643: connect: connection refused"
Apr 25 00:43:13.787144 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:13.787111 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" podUID="da551da4-85dd-4794-bb23-8cd99a952716" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused"
Apr 25 00:43:15.284489 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:15.284468 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv"
Apr 25 00:43:15.371326 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:15.371251 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da551da4-85dd-4794-bb23-8cd99a952716-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"da551da4-85dd-4794-bb23-8cd99a952716\" (UID: \"da551da4-85dd-4794-bb23-8cd99a952716\") "
Apr 25 00:43:15.371326 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:15.371287 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnln2\" (UniqueName: \"kubernetes.io/projected/da551da4-85dd-4794-bb23-8cd99a952716-kube-api-access-gnln2\") pod \"da551da4-85dd-4794-bb23-8cd99a952716\" (UID: \"da551da4-85dd-4794-bb23-8cd99a952716\") "
Apr 25 00:43:15.371326 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:15.371326 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da551da4-85dd-4794-bb23-8cd99a952716-proxy-tls\") pod \"da551da4-85dd-4794-bb23-8cd99a952716\" (UID: \"da551da4-85dd-4794-bb23-8cd99a952716\") "
Apr 25 00:43:15.371573 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:15.371387 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da551da4-85dd-4794-bb23-8cd99a952716-kserve-provision-location\") pod \"da551da4-85dd-4794-bb23-8cd99a952716\" (UID: \"da551da4-85dd-4794-bb23-8cd99a952716\") "
Apr 25 00:43:15.371633 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:15.371583 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da551da4-85dd-4794-bb23-8cd99a952716-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config") pod "da551da4-85dd-4794-bb23-8cd99a952716" (UID: "da551da4-85dd-4794-bb23-8cd99a952716"). InnerVolumeSpecName "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:43:15.371790 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:15.371768 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da551da4-85dd-4794-bb23-8cd99a952716-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "da551da4-85dd-4794-bb23-8cd99a952716" (UID: "da551da4-85dd-4794-bb23-8cd99a952716"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:43:15.373470 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:15.373452 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da551da4-85dd-4794-bb23-8cd99a952716-kube-api-access-gnln2" (OuterVolumeSpecName: "kube-api-access-gnln2") pod "da551da4-85dd-4794-bb23-8cd99a952716" (UID: "da551da4-85dd-4794-bb23-8cd99a952716"). InnerVolumeSpecName "kube-api-access-gnln2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:43:15.373532 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:15.373451 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da551da4-85dd-4794-bb23-8cd99a952716-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "da551da4-85dd-4794-bb23-8cd99a952716" (UID: "da551da4-85dd-4794-bb23-8cd99a952716"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:43:15.472078 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:15.472034 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da551da4-85dd-4794-bb23-8cd99a952716-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:43:15.472078 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:15.472067 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da551da4-85dd-4794-bb23-8cd99a952716-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:43:15.472350 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:15.472083 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gnln2\" (UniqueName: \"kubernetes.io/projected/da551da4-85dd-4794-bb23-8cd99a952716-kube-api-access-gnln2\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:43:15.472350 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:15.472098 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da551da4-85dd-4794-bb23-8cd99a952716-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:43:15.992653 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:15.992616 2576 generic.go:358] "Generic (PLEG): container finished" podID="da551da4-85dd-4794-bb23-8cd99a952716" containerID="6582d5a74e05a46c539d0f1c45d4a975d4c3429c8af7da317760b955f01bf1a3" exitCode=0
Apr 25 00:43:15.992850 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:15.992695 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv"
Apr 25 00:43:15.992850 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:15.992691 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" event={"ID":"da551da4-85dd-4794-bb23-8cd99a952716","Type":"ContainerDied","Data":"6582d5a74e05a46c539d0f1c45d4a975d4c3429c8af7da317760b955f01bf1a3"}
Apr 25 00:43:15.992850 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:15.992736 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv" event={"ID":"da551da4-85dd-4794-bb23-8cd99a952716","Type":"ContainerDied","Data":"5f10c22971042fc9a35794ecc769eb418058d522611f940f7bc98f05d89edff8"}
Apr 25 00:43:15.992850 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:15.992756 2576 scope.go:117] "RemoveContainer" containerID="ad892a18598fc1cabfc23f7dac610f4859983afd743753e55cae24f03bb90ea2"
Apr 25 00:43:16.001250 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:16.001196 2576 scope.go:117] "RemoveContainer" containerID="6582d5a74e05a46c539d0f1c45d4a975d4c3429c8af7da317760b955f01bf1a3"
Apr 25 00:43:16.009022 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:16.009003 2576 scope.go:117] "RemoveContainer" containerID="abcaeba355595988f12edcfa1c9d0a54edb8c034ba2ece766129299d81caddc5"
Apr 25 00:43:16.014772 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:16.014750 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv"]
Apr 25 00:43:16.016621 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:16.016604 2576 scope.go:117] "RemoveContainer" containerID="ad892a18598fc1cabfc23f7dac610f4859983afd743753e55cae24f03bb90ea2"
Apr 25 00:43:16.016854 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:43:16.016836 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad892a18598fc1cabfc23f7dac610f4859983afd743753e55cae24f03bb90ea2\": container with ID starting with ad892a18598fc1cabfc23f7dac610f4859983afd743753e55cae24f03bb90ea2 not found: ID does not exist" containerID="ad892a18598fc1cabfc23f7dac610f4859983afd743753e55cae24f03bb90ea2"
Apr 25 00:43:16.016957 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:16.016861 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad892a18598fc1cabfc23f7dac610f4859983afd743753e55cae24f03bb90ea2"} err="failed to get container status \"ad892a18598fc1cabfc23f7dac610f4859983afd743753e55cae24f03bb90ea2\": rpc error: code = NotFound desc = could not find container \"ad892a18598fc1cabfc23f7dac610f4859983afd743753e55cae24f03bb90ea2\": container with ID starting with ad892a18598fc1cabfc23f7dac610f4859983afd743753e55cae24f03bb90ea2 not found: ID does not exist"
Apr 25 00:43:16.016957 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:16.016878 2576 scope.go:117] "RemoveContainer" containerID="6582d5a74e05a46c539d0f1c45d4a975d4c3429c8af7da317760b955f01bf1a3"
Apr 25 00:43:16.017148 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:43:16.017130 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6582d5a74e05a46c539d0f1c45d4a975d4c3429c8af7da317760b955f01bf1a3\": container with ID starting with 6582d5a74e05a46c539d0f1c45d4a975d4c3429c8af7da317760b955f01bf1a3 not found: ID does not exist" containerID="6582d5a74e05a46c539d0f1c45d4a975d4c3429c8af7da317760b955f01bf1a3"
Apr 25 00:43:16.017213 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:16.017151 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6582d5a74e05a46c539d0f1c45d4a975d4c3429c8af7da317760b955f01bf1a3"} err="failed to get container status \"6582d5a74e05a46c539d0f1c45d4a975d4c3429c8af7da317760b955f01bf1a3\": rpc error: code = NotFound desc = could not find container \"6582d5a74e05a46c539d0f1c45d4a975d4c3429c8af7da317760b955f01bf1a3\": container with ID starting with 6582d5a74e05a46c539d0f1c45d4a975d4c3429c8af7da317760b955f01bf1a3 not found: ID does not exist"
Apr 25 00:43:16.017213 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:16.017167 2576 scope.go:117] "RemoveContainer" containerID="abcaeba355595988f12edcfa1c9d0a54edb8c034ba2ece766129299d81caddc5"
Apr 25 00:43:16.017391 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:43:16.017374 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abcaeba355595988f12edcfa1c9d0a54edb8c034ba2ece766129299d81caddc5\": container with ID starting with abcaeba355595988f12edcfa1c9d0a54edb8c034ba2ece766129299d81caddc5 not found: ID does not exist" containerID="abcaeba355595988f12edcfa1c9d0a54edb8c034ba2ece766129299d81caddc5"
Apr 25 00:43:16.017456 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:16.017394 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abcaeba355595988f12edcfa1c9d0a54edb8c034ba2ece766129299d81caddc5"} err="failed to get container status \"abcaeba355595988f12edcfa1c9d0a54edb8c034ba2ece766129299d81caddc5\": rpc error: code = NotFound desc = could not find container \"abcaeba355595988f12edcfa1c9d0a54edb8c034ba2ece766129299d81caddc5\": container with ID starting with abcaeba355595988f12edcfa1c9d0a54edb8c034ba2ece766129299d81caddc5 not found: ID does not exist"
Apr 25 00:43:16.020056 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:16.020035 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5c48844574-8v4rv"]
Apr 25 00:43:16.315367 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:16.315337 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da551da4-85dd-4794-bb23-8cd99a952716" path="/var/lib/kubelet/pods/da551da4-85dd-4794-bb23-8cd99a952716/volumes"
Apr 25 00:43:16.997758 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:16.997721 2576 generic.go:358] "Generic (PLEG): container finished" podID="a2890b2b-4443-40a7-b918-3e97d3d46292" containerID="e3f128b5ddaab18e47991a147ed2569f7eda6eb191d616719c762a5890581785" exitCode=0
Apr 25 00:43:16.997944 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:16.997795 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" event={"ID":"a2890b2b-4443-40a7-b918-3e97d3d46292","Type":"ContainerDied","Data":"e3f128b5ddaab18e47991a147ed2569f7eda6eb191d616719c762a5890581785"}
Apr 25 00:43:21.017523 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:21.017498 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" event={"ID":"a2890b2b-4443-40a7-b918-3e97d3d46292","Type":"ContainerStarted","Data":"7e827b1cc3015c4e133374e66369f80485c0bcbd5016ece15f4f107e8456c83f"}
Apr 25 00:43:22.022422 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:22.022380 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" event={"ID":"a2890b2b-4443-40a7-b918-3e97d3d46292","Type":"ContainerStarted","Data":"278e997249d1bfe33564e992262ac624d68f3b51e1cd91cadbc80d90d14161d1"}
Apr 25 00:43:22.022860 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:22.022569 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq"
Apr 25 00:43:22.043598 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:22.043553 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" podStartSLOduration=7.164421144 podStartE2EDuration="11.043541059s" podCreationTimestamp="2026-04-25 00:43:11 +0000 UTC" firstStartedPulling="2026-04-25 00:43:16.998870971 +0000 UTC m=+2957.279987868" lastFinishedPulling="2026-04-25 00:43:20.877990885 +0000 UTC m=+2961.159107783" observedRunningTime="2026-04-25 00:43:22.042427645 +0000 UTC m=+2962.323544566" watchObservedRunningTime="2026-04-25 00:43:22.043541059 +0000 UTC m=+2962.324658387"
Apr 25 00:43:23.025770 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:23.025731 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq"
Apr 25 00:43:23.026994 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:23.026966 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" podUID="a2890b2b-4443-40a7-b918-3e97d3d46292" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 25 00:43:24.028188 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:24.028147 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" podUID="a2890b2b-4443-40a7-b918-3e97d3d46292" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 25 00:43:29.032469 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:29.032437 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq"
Apr 25 00:43:29.032897 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:29.032873 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" podUID="a2890b2b-4443-40a7-b918-3e97d3d46292" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 25 00:43:39.034045 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:39.034015 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq"
Apr 25 00:43:51.714135 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.714099 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq"]
Apr 25 00:43:51.714585 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.714539 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" podUID="a2890b2b-4443-40a7-b918-3e97d3d46292" containerName="kserve-container" containerID="cri-o://7e827b1cc3015c4e133374e66369f80485c0bcbd5016ece15f4f107e8456c83f" gracePeriod=30
Apr 25 00:43:51.714671 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.714583 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" podUID="a2890b2b-4443-40a7-b918-3e97d3d46292" containerName="kube-rbac-proxy" containerID="cri-o://278e997249d1bfe33564e992262ac624d68f3b51e1cd91cadbc80d90d14161d1" gracePeriod=30
Apr 25 00:43:51.809211 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.809181 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"]
Apr 25 00:43:51.809513 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.809499 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da551da4-85dd-4794-bb23-8cd99a952716" containerName="storage-initializer"
Apr 25 00:43:51.809575 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.809515 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="da551da4-85dd-4794-bb23-8cd99a952716" containerName="storage-initializer"
Apr 25 00:43:51.809575 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.809523 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da551da4-85dd-4794-bb23-8cd99a952716" containerName="kube-rbac-proxy"
Apr 25 00:43:51.809575 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.809529 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="da551da4-85dd-4794-bb23-8cd99a952716" containerName="kube-rbac-proxy"
Apr 25 00:43:51.809575 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.809536 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da551da4-85dd-4794-bb23-8cd99a952716" containerName="kserve-container"
Apr 25 00:43:51.809575 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.809542 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="da551da4-85dd-4794-bb23-8cd99a952716" containerName="kserve-container"
Apr 25 00:43:51.809753 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.809591 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="da551da4-85dd-4794-bb23-8cd99a952716" containerName="kube-rbac-proxy"
Apr 25 00:43:51.809753 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.809598 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="da551da4-85dd-4794-bb23-8cd99a952716" containerName="kserve-container"
Apr 25 00:43:51.812693 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.812674 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"
Apr 25 00:43:51.815004 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.814980 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-predictor-serving-cert\""
Apr 25 00:43:51.815091 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.815061 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\""
Apr 25 00:43:51.825319 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.825297 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"]
Apr 25 00:43:51.837142 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.837116 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea78c718-ea55-4e9c-990b-9f214cf08ef6-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7\" (UID: \"ea78c718-ea55-4e9c-990b-9f214cf08ef6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"
Apr 25 00:43:51.837239 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.837142 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea78c718-ea55-4e9c-990b-9f214cf08ef6-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7\" (UID: \"ea78c718-ea55-4e9c-990b-9f214cf08ef6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"
Apr 25 00:43:51.837239 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.837165 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ea78c718-ea55-4e9c-990b-9f214cf08ef6-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7\" (UID: \"ea78c718-ea55-4e9c-990b-9f214cf08ef6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"
Apr 25 00:43:51.837239 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.837185 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txhzj\" (UniqueName: \"kubernetes.io/projected/ea78c718-ea55-4e9c-990b-9f214cf08ef6-kube-api-access-txhzj\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7\" (UID: \"ea78c718-ea55-4e9c-990b-9f214cf08ef6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"
Apr 25 00:43:51.938433 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.938405 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea78c718-ea55-4e9c-990b-9f214cf08ef6-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7\" (UID: \"ea78c718-ea55-4e9c-990b-9f214cf08ef6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"
Apr 25 00:43:51.938538 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.938434 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea78c718-ea55-4e9c-990b-9f214cf08ef6-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7\" (UID: \"ea78c718-ea55-4e9c-990b-9f214cf08ef6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"
Apr 25 00:43:51.938538 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.938456 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ea78c718-ea55-4e9c-990b-9f214cf08ef6-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7\" (UID: \"ea78c718-ea55-4e9c-990b-9f214cf08ef6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"
Apr 25 00:43:51.938636 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:43:51.938553 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-serving-cert: secret "isvc-tensorflow-runtime-predictor-serving-cert" not found
Apr 25 00:43:51.938636 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.938590 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-txhzj\" (UniqueName: \"kubernetes.io/projected/ea78c718-ea55-4e9c-990b-9f214cf08ef6-kube-api-access-txhzj\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7\" (UID: \"ea78c718-ea55-4e9c-990b-9f214cf08ef6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"
Apr 25 00:43:51.938636 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:43:51.938618 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea78c718-ea55-4e9c-990b-9f214cf08ef6-proxy-tls podName:ea78c718-ea55-4e9c-990b-9f214cf08ef6 nodeName:}" failed. No retries permitted until 2026-04-25 00:43:52.438597964 +0000 UTC m=+2992.719714862 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ea78c718-ea55-4e9c-990b-9f214cf08ef6-proxy-tls") pod "isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7" (UID: "ea78c718-ea55-4e9c-990b-9f214cf08ef6") : secret "isvc-tensorflow-runtime-predictor-serving-cert" not found
Apr 25 00:43:51.938853 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.938830 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea78c718-ea55-4e9c-990b-9f214cf08ef6-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7\" (UID: \"ea78c718-ea55-4e9c-990b-9f214cf08ef6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"
Apr 25 00:43:51.939121 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.939105 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ea78c718-ea55-4e9c-990b-9f214cf08ef6-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7\" (UID: \"ea78c718-ea55-4e9c-990b-9f214cf08ef6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"
Apr 25 00:43:51.951353 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:51.951326 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-txhzj\" (UniqueName: \"kubernetes.io/projected/ea78c718-ea55-4e9c-990b-9f214cf08ef6-kube-api-access-txhzj\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7\" (UID: \"ea78c718-ea55-4e9c-990b-9f214cf08ef6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"
Apr 25 00:43:52.105765 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:52.105732 2576 generic.go:358] "Generic (PLEG): container finished" podID="a2890b2b-4443-40a7-b918-3e97d3d46292" containerID="278e997249d1bfe33564e992262ac624d68f3b51e1cd91cadbc80d90d14161d1" exitCode=2
Apr 25 00:43:52.105949 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:52.105800 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" event={"ID":"a2890b2b-4443-40a7-b918-3e97d3d46292","Type":"ContainerDied","Data":"278e997249d1bfe33564e992262ac624d68f3b51e1cd91cadbc80d90d14161d1"}
Apr 25 00:43:52.442238 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:52.442138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea78c718-ea55-4e9c-990b-9f214cf08ef6-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7\" (UID: \"ea78c718-ea55-4e9c-990b-9f214cf08ef6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"
Apr 25 00:43:52.444765 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:52.444741 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea78c718-ea55-4e9c-990b-9f214cf08ef6-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7\" (UID: \"ea78c718-ea55-4e9c-990b-9f214cf08ef6\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"
Apr 25 00:43:52.723351 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:52.723264 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"
Apr 25 00:43:52.843101 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:52.843074 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"]
Apr 25 00:43:52.845669 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:43:52.845636 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea78c718_ea55_4e9c_990b_9f214cf08ef6.slice/crio-4bfbb6cd30710eb02aece20a8790445e5c2bfc17ad42b704f628999389c99e75 WatchSource:0}: Error finding container 4bfbb6cd30710eb02aece20a8790445e5c2bfc17ad42b704f628999389c99e75: Status 404 returned error can't find the container with id 4bfbb6cd30710eb02aece20a8790445e5c2bfc17ad42b704f628999389c99e75
Apr 25 00:43:53.110425 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:53.110388 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7" event={"ID":"ea78c718-ea55-4e9c-990b-9f214cf08ef6","Type":"ContainerStarted","Data":"0a31a4efdcd065c7a0fdea7eae4c3153b010f8660ddfe60574be86725d1ec057"}
Apr 25 00:43:53.110425 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:53.110427 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7" event={"ID":"ea78c718-ea55-4e9c-990b-9f214cf08ef6","Type":"ContainerStarted","Data":"4bfbb6cd30710eb02aece20a8790445e5c2bfc17ad42b704f628999389c99e75"}
Apr 25 00:43:54.029277 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:54.029241 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" podUID="a2890b2b-4443-40a7-b918-3e97d3d46292" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.53:8643/healthz\": dial tcp 10.134.0.53:8643: connect: connection refused"
Apr 25 00:43:58.126699 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:58.126663 2576 generic.go:358] "Generic (PLEG): container finished" podID="ea78c718-ea55-4e9c-990b-9f214cf08ef6" containerID="0a31a4efdcd065c7a0fdea7eae4c3153b010f8660ddfe60574be86725d1ec057" exitCode=0
Apr 25 00:43:58.127102 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:58.126740 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7" event={"ID":"ea78c718-ea55-4e9c-990b-9f214cf08ef6","Type":"ContainerDied","Data":"0a31a4efdcd065c7a0fdea7eae4c3153b010f8660ddfe60574be86725d1ec057"}
Apr 25 00:43:59.029200 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:59.029158 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" podUID="a2890b2b-4443-40a7-b918-3e97d3d46292" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.53:8643/healthz\": dial tcp 10.134.0.53:8643: connect: connection refused"
Apr 25 00:43:59.131680 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:59.131647 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7" event={"ID":"ea78c718-ea55-4e9c-990b-9f214cf08ef6","Type":"ContainerStarted","Data":"1416d55d8a3bfeb4f85eeec4a1ee36fe9fd845ecfaa6a512ba91d39a0ff95266"}
Apr 25 00:43:59.131680 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:59.131685 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7" event={"ID":"ea78c718-ea55-4e9c-990b-9f214cf08ef6","Type":"ContainerStarted","Data":"10d2a049c25ada01830c06f5f98c97f1cc950b45536918ebefa04747f85a8fe7"}
Apr 25 00:43:59.132062 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:59.131891 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"
Apr 25 00:43:59.150496 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:43:59.150453 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7" podStartSLOduration=8.150435856 podStartE2EDuration="8.150435856s" podCreationTimestamp="2026-04-25 00:43:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:43:59.14902542 +0000 UTC m=+2999.430142352" watchObservedRunningTime="2026-04-25 00:43:59.150435856 +0000 UTC m=+2999.431552776"
Apr 25 00:44:00.134498 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:00.134465 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"
Apr 25 00:44:00.135607 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:00.135583 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7" podUID="ea78c718-ea55-4e9c-990b-9f214cf08ef6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 25 00:44:01.137641 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:01.137599 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7" podUID="ea78c718-ea55-4e9c-990b-9f214cf08ef6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 25 00:44:04.028424 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:04.028377 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" podUID="a2890b2b-4443-40a7-b918-3e97d3d46292" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.53:8643/healthz\": dial tcp 10.134.0.53:8643: connect: connection refused"
Apr 25 00:44:04.028811 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:04.028494 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq"
Apr 25 00:44:06.142476 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:06.142443 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"
Apr 25 00:44:06.143124 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:06.143096 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7" podUID="ea78c718-ea55-4e9c-990b-9f214cf08ef6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 25 00:44:09.028955 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:09.028899 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" podUID="a2890b2b-4443-40a7-b918-3e97d3d46292" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.53:8643/healthz\": dial tcp 10.134.0.53:8643: connect: connection refused"
Apr 25 00:44:14.028603 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:14.028564 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" podUID="a2890b2b-4443-40a7-b918-3e97d3d46292" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.53:8643/healthz\": dial tcp 10.134.0.53:8643: connect: connection refused"
Apr 25 00:44:16.143523 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:16.143497 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"
Apr 25 00:44:19.028291 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:19.028254 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" podUID="a2890b2b-4443-40a7-b918-3e97d3d46292" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.53:8643/healthz\": dial tcp 10.134.0.53:8643: connect: connection refused"
Apr 25 00:44:21.736352 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:44:21.736315 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2890b2b_4443_40a7_b918_3e97d3d46292.slice/crio-de2efabd5c1882ab2e688e56622d7578ccd1c068fcd432bc1cd1a21f3c0a7e27\": RecentStats: unable to find data in memory cache]"
Apr 25 00:44:22.195820 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:22.195785 2576 generic.go:358] "Generic (PLEG): container finished" podID="a2890b2b-4443-40a7-b918-3e97d3d46292" containerID="7e827b1cc3015c4e133374e66369f80485c0bcbd5016ece15f4f107e8456c83f" exitCode=137
Apr 25 00:44:22.196001 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:22.195862 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" event={"ID":"a2890b2b-4443-40a7-b918-3e97d3d46292","Type":"ContainerDied","Data":"7e827b1cc3015c4e133374e66369f80485c0bcbd5016ece15f4f107e8456c83f"}
Apr 25 00:44:22.354720 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:22.354696 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" Apr 25 00:44:22.468253 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:22.468172 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a2890b2b-4443-40a7-b918-3e97d3d46292-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"a2890b2b-4443-40a7-b918-3e97d3d46292\" (UID: \"a2890b2b-4443-40a7-b918-3e97d3d46292\") " Apr 25 00:44:22.468253 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:22.468218 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2890b2b-4443-40a7-b918-3e97d3d46292-proxy-tls\") pod \"a2890b2b-4443-40a7-b918-3e97d3d46292\" (UID: \"a2890b2b-4443-40a7-b918-3e97d3d46292\") " Apr 25 00:44:22.468481 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:22.468285 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2890b2b-4443-40a7-b918-3e97d3d46292-kserve-provision-location\") pod \"a2890b2b-4443-40a7-b918-3e97d3d46292\" (UID: \"a2890b2b-4443-40a7-b918-3e97d3d46292\") " Apr 25 00:44:22.468481 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:22.468304 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zplzb\" (UniqueName: \"kubernetes.io/projected/a2890b2b-4443-40a7-b918-3e97d3d46292-kube-api-access-zplzb\") pod \"a2890b2b-4443-40a7-b918-3e97d3d46292\" (UID: \"a2890b2b-4443-40a7-b918-3e97d3d46292\") " Apr 25 00:44:22.468597 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:22.468578 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2890b2b-4443-40a7-b918-3e97d3d46292-isvc-tensorflow-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-kube-rbac-proxy-sar-config") 
pod "a2890b2b-4443-40a7-b918-3e97d3d46292" (UID: "a2890b2b-4443-40a7-b918-3e97d3d46292"). InnerVolumeSpecName "isvc-tensorflow-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:44:22.470503 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:22.470481 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2890b2b-4443-40a7-b918-3e97d3d46292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a2890b2b-4443-40a7-b918-3e97d3d46292" (UID: "a2890b2b-4443-40a7-b918-3e97d3d46292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:44:22.470604 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:22.470554 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2890b2b-4443-40a7-b918-3e97d3d46292-kube-api-access-zplzb" (OuterVolumeSpecName: "kube-api-access-zplzb") pod "a2890b2b-4443-40a7-b918-3e97d3d46292" (UID: "a2890b2b-4443-40a7-b918-3e97d3d46292"). InnerVolumeSpecName "kube-api-access-zplzb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:44:22.484004 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:22.483972 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2890b2b-4443-40a7-b918-3e97d3d46292-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a2890b2b-4443-40a7-b918-3e97d3d46292" (UID: "a2890b2b-4443-40a7-b918-3e97d3d46292"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:44:22.568813 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:22.568756 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2890b2b-4443-40a7-b918-3e97d3d46292-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:44:22.568813 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:22.568803 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zplzb\" (UniqueName: \"kubernetes.io/projected/a2890b2b-4443-40a7-b918-3e97d3d46292-kube-api-access-zplzb\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:44:22.568813 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:22.568816 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a2890b2b-4443-40a7-b918-3e97d3d46292-isvc-tensorflow-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:44:22.568813 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:22.568826 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a2890b2b-4443-40a7-b918-3e97d3d46292-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:44:23.200579 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:23.200550 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" Apr 25 00:44:23.201086 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:23.200525 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq" event={"ID":"a2890b2b-4443-40a7-b918-3e97d3d46292","Type":"ContainerDied","Data":"de2efabd5c1882ab2e688e56622d7578ccd1c068fcd432bc1cd1a21f3c0a7e27"} Apr 25 00:44:23.201086 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:23.200670 2576 scope.go:117] "RemoveContainer" containerID="278e997249d1bfe33564e992262ac624d68f3b51e1cd91cadbc80d90d14161d1" Apr 25 00:44:23.209443 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:23.209422 2576 scope.go:117] "RemoveContainer" containerID="7e827b1cc3015c4e133374e66369f80485c0bcbd5016ece15f4f107e8456c83f" Apr 25 00:44:23.216391 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:23.216375 2576 scope.go:117] "RemoveContainer" containerID="e3f128b5ddaab18e47991a147ed2569f7eda6eb191d616719c762a5890581785" Apr 25 00:44:23.221560 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:23.221538 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq"] Apr 25 00:44:23.227484 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:23.227463 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-zg7cq"] Apr 25 00:44:24.315430 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:24.315400 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2890b2b-4443-40a7-b918-3e97d3d46292" path="/var/lib/kubelet/pods/a2890b2b-4443-40a7-b918-3e97d3d46292/volumes" Apr 25 00:44:32.207353 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.207318 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"] Apr 25 00:44:32.207833 ip-10-0-129-109 kubenswrapper[2576]: 
I0425 00:44:32.207596 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7" podUID="ea78c718-ea55-4e9c-990b-9f214cf08ef6" containerName="kserve-container" containerID="cri-o://10d2a049c25ada01830c06f5f98c97f1cc950b45536918ebefa04747f85a8fe7" gracePeriod=30 Apr 25 00:44:32.207833 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.207671 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7" podUID="ea78c718-ea55-4e9c-990b-9f214cf08ef6" containerName="kube-rbac-proxy" containerID="cri-o://1416d55d8a3bfeb4f85eeec4a1ee36fe9fd845ecfaa6a512ba91d39a0ff95266" gracePeriod=30 Apr 25 00:44:32.300777 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.300742 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f"] Apr 25 00:44:32.301046 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.301033 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2890b2b-4443-40a7-b918-3e97d3d46292" containerName="storage-initializer" Apr 25 00:44:32.301090 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.301047 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2890b2b-4443-40a7-b918-3e97d3d46292" containerName="storage-initializer" Apr 25 00:44:32.301090 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.301061 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2890b2b-4443-40a7-b918-3e97d3d46292" containerName="kserve-container" Apr 25 00:44:32.301090 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.301067 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2890b2b-4443-40a7-b918-3e97d3d46292" containerName="kserve-container" Apr 25 00:44:32.301090 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.301077 2576 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="a2890b2b-4443-40a7-b918-3e97d3d46292" containerName="kube-rbac-proxy" Apr 25 00:44:32.301090 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.301083 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2890b2b-4443-40a7-b918-3e97d3d46292" containerName="kube-rbac-proxy" Apr 25 00:44:32.301293 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.301134 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2890b2b-4443-40a7-b918-3e97d3d46292" containerName="kserve-container" Apr 25 00:44:32.301293 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.301142 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2890b2b-4443-40a7-b918-3e97d3d46292" containerName="kube-rbac-proxy" Apr 25 00:44:32.304195 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.304176 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" Apr 25 00:44:32.306661 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.306633 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-predictor-serving-cert\"" Apr 25 00:44:32.306661 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.306650 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-kube-rbac-proxy-sar-config\"" Apr 25 00:44:32.315398 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.315378 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f"] Apr 25 00:44:32.443285 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.443253 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1867c476-c38b-4261-9709-c07b79439cce-kserve-provision-location\") pod 
\"isvc-triton-predictor-84bb65d94b-xk24f\" (UID: \"1867c476-c38b-4261-9709-c07b79439cce\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" Apr 25 00:44:32.443406 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.443299 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1867c476-c38b-4261-9709-c07b79439cce-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-xk24f\" (UID: \"1867c476-c38b-4261-9709-c07b79439cce\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" Apr 25 00:44:32.443406 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.443378 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1867c476-c38b-4261-9709-c07b79439cce-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-xk24f\" (UID: \"1867c476-c38b-4261-9709-c07b79439cce\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" Apr 25 00:44:32.443406 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.443402 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4m8g\" (UniqueName: \"kubernetes.io/projected/1867c476-c38b-4261-9709-c07b79439cce-kube-api-access-g4m8g\") pod \"isvc-triton-predictor-84bb65d94b-xk24f\" (UID: \"1867c476-c38b-4261-9709-c07b79439cce\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" Apr 25 00:44:32.544266 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.544233 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1867c476-c38b-4261-9709-c07b79439cce-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-xk24f\" (UID: \"1867c476-c38b-4261-9709-c07b79439cce\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" 
Apr 25 00:44:32.544464 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.544297 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1867c476-c38b-4261-9709-c07b79439cce-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-xk24f\" (UID: \"1867c476-c38b-4261-9709-c07b79439cce\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" Apr 25 00:44:32.544464 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.544329 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4m8g\" (UniqueName: \"kubernetes.io/projected/1867c476-c38b-4261-9709-c07b79439cce-kube-api-access-g4m8g\") pod \"isvc-triton-predictor-84bb65d94b-xk24f\" (UID: \"1867c476-c38b-4261-9709-c07b79439cce\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" Apr 25 00:44:32.544464 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.544359 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1867c476-c38b-4261-9709-c07b79439cce-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-xk24f\" (UID: \"1867c476-c38b-4261-9709-c07b79439cce\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" Apr 25 00:44:32.544742 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.544724 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1867c476-c38b-4261-9709-c07b79439cce-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-xk24f\" (UID: \"1867c476-c38b-4261-9709-c07b79439cce\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" Apr 25 00:44:32.545084 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.545061 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1867c476-c38b-4261-9709-c07b79439cce-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-xk24f\" (UID: \"1867c476-c38b-4261-9709-c07b79439cce\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" Apr 25 00:44:32.546906 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.546887 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1867c476-c38b-4261-9709-c07b79439cce-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-xk24f\" (UID: \"1867c476-c38b-4261-9709-c07b79439cce\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" Apr 25 00:44:32.552582 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.552557 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4m8g\" (UniqueName: \"kubernetes.io/projected/1867c476-c38b-4261-9709-c07b79439cce-kube-api-access-g4m8g\") pod \"isvc-triton-predictor-84bb65d94b-xk24f\" (UID: \"1867c476-c38b-4261-9709-c07b79439cce\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" Apr 25 00:44:32.615157 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.615126 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" Apr 25 00:44:32.738308 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:32.738206 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f"] Apr 25 00:44:32.740804 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:44:32.740776 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1867c476_c38b_4261_9709_c07b79439cce.slice/crio-e60632ad8fa32c223f3d550aad8356a3f1c913c28e7dd2c74f64a81402e104b9 WatchSource:0}: Error finding container e60632ad8fa32c223f3d550aad8356a3f1c913c28e7dd2c74f64a81402e104b9: Status 404 returned error can't find the container with id e60632ad8fa32c223f3d550aad8356a3f1c913c28e7dd2c74f64a81402e104b9 Apr 25 00:44:33.232160 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:33.232130 2576 generic.go:358] "Generic (PLEG): container finished" podID="ea78c718-ea55-4e9c-990b-9f214cf08ef6" containerID="1416d55d8a3bfeb4f85eeec4a1ee36fe9fd845ecfaa6a512ba91d39a0ff95266" exitCode=2 Apr 25 00:44:33.232516 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:33.232206 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7" event={"ID":"ea78c718-ea55-4e9c-990b-9f214cf08ef6","Type":"ContainerDied","Data":"1416d55d8a3bfeb4f85eeec4a1ee36fe9fd845ecfaa6a512ba91d39a0ff95266"} Apr 25 00:44:33.233421 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:33.233393 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" event={"ID":"1867c476-c38b-4261-9709-c07b79439cce","Type":"ContainerStarted","Data":"ca63627b1755b2b72fc440a43c143c0019bec302b87135dd556aed80413e881c"} Apr 25 00:44:33.233421 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:33.233423 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" event={"ID":"1867c476-c38b-4261-9709-c07b79439cce","Type":"ContainerStarted","Data":"e60632ad8fa32c223f3d550aad8356a3f1c913c28e7dd2c74f64a81402e104b9"} Apr 25 00:44:36.137932 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:36.137872 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7" podUID="ea78c718-ea55-4e9c-990b-9f214cf08ef6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.54:8643/healthz\": dial tcp 10.134.0.54:8643: connect: connection refused" Apr 25 00:44:37.150110 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:37.150078 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 25 00:44:37.155302 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:37.155281 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 25 00:44:37.246639 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:37.246605 2576 generic.go:358] "Generic (PLEG): container finished" podID="1867c476-c38b-4261-9709-c07b79439cce" containerID="ca63627b1755b2b72fc440a43c143c0019bec302b87135dd556aed80413e881c" exitCode=0 Apr 25 00:44:37.246780 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:37.246679 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" event={"ID":"1867c476-c38b-4261-9709-c07b79439cce","Type":"ContainerDied","Data":"ca63627b1755b2b72fc440a43c143c0019bec302b87135dd556aed80413e881c"} Apr 25 00:44:41.138783 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:41.138383 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7" 
podUID="ea78c718-ea55-4e9c-990b-9f214cf08ef6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.54:8643/healthz\": dial tcp 10.134.0.54:8643: connect: connection refused" Apr 25 00:44:46.139017 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:46.138902 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7" podUID="ea78c718-ea55-4e9c-990b-9f214cf08ef6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.54:8643/healthz\": dial tcp 10.134.0.54:8643: connect: connection refused" Apr 25 00:44:46.139723 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:46.139079 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7" Apr 25 00:44:51.138808 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:51.138746 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7" podUID="ea78c718-ea55-4e9c-990b-9f214cf08ef6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.54:8643/healthz\": dial tcp 10.134.0.54:8643: connect: connection refused" Apr 25 00:44:56.138509 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:44:56.138453 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7" podUID="ea78c718-ea55-4e9c-990b-9f214cf08ef6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.54:8643/healthz\": dial tcp 10.134.0.54:8643: connect: connection refused" Apr 25 00:45:01.138578 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:01.138523 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7" podUID="ea78c718-ea55-4e9c-990b-9f214cf08ef6" containerName="kube-rbac-proxy" 
probeResult="failure" output="Get \"https://10.134.0.54:8643/healthz\": dial tcp 10.134.0.54:8643: connect: connection refused" Apr 25 00:45:02.356677 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:02.356641 2576 generic.go:358] "Generic (PLEG): container finished" podID="ea78c718-ea55-4e9c-990b-9f214cf08ef6" containerID="10d2a049c25ada01830c06f5f98c97f1cc950b45536918ebefa04747f85a8fe7" exitCode=137 Apr 25 00:45:02.357190 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:02.356723 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7" event={"ID":"ea78c718-ea55-4e9c-990b-9f214cf08ef6","Type":"ContainerDied","Data":"10d2a049c25ada01830c06f5f98c97f1cc950b45536918ebefa04747f85a8fe7"} Apr 25 00:45:02.935500 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:02.935471 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7" Apr 25 00:45:03.009669 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:03.009587 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea78c718-ea55-4e9c-990b-9f214cf08ef6-proxy-tls\") pod \"ea78c718-ea55-4e9c-990b-9f214cf08ef6\" (UID: \"ea78c718-ea55-4e9c-990b-9f214cf08ef6\") " Apr 25 00:45:03.009669 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:03.009639 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txhzj\" (UniqueName: \"kubernetes.io/projected/ea78c718-ea55-4e9c-990b-9f214cf08ef6-kube-api-access-txhzj\") pod \"ea78c718-ea55-4e9c-990b-9f214cf08ef6\" (UID: \"ea78c718-ea55-4e9c-990b-9f214cf08ef6\") " Apr 25 00:45:03.009669 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:03.009669 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/ea78c718-ea55-4e9c-990b-9f214cf08ef6-kserve-provision-location\") pod \"ea78c718-ea55-4e9c-990b-9f214cf08ef6\" (UID: \"ea78c718-ea55-4e9c-990b-9f214cf08ef6\") " Apr 25 00:45:03.009982 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:03.009751 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ea78c718-ea55-4e9c-990b-9f214cf08ef6-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"ea78c718-ea55-4e9c-990b-9f214cf08ef6\" (UID: \"ea78c718-ea55-4e9c-990b-9f214cf08ef6\") " Apr 25 00:45:03.010327 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:03.010272 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea78c718-ea55-4e9c-990b-9f214cf08ef6-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config") pod "ea78c718-ea55-4e9c-990b-9f214cf08ef6" (UID: "ea78c718-ea55-4e9c-990b-9f214cf08ef6"). InnerVolumeSpecName "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:45:03.013297 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:03.013256 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea78c718-ea55-4e9c-990b-9f214cf08ef6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ea78c718-ea55-4e9c-990b-9f214cf08ef6" (UID: "ea78c718-ea55-4e9c-990b-9f214cf08ef6"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:45:03.013525 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:03.013500 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea78c718-ea55-4e9c-990b-9f214cf08ef6-kube-api-access-txhzj" (OuterVolumeSpecName: "kube-api-access-txhzj") pod "ea78c718-ea55-4e9c-990b-9f214cf08ef6" (UID: "ea78c718-ea55-4e9c-990b-9f214cf08ef6"). InnerVolumeSpecName "kube-api-access-txhzj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:45:03.018718 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:03.018686 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea78c718-ea55-4e9c-990b-9f214cf08ef6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ea78c718-ea55-4e9c-990b-9f214cf08ef6" (UID: "ea78c718-ea55-4e9c-990b-9f214cf08ef6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:45:03.111177 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:03.111124 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ea78c718-ea55-4e9c-990b-9f214cf08ef6-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:45:03.111177 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:03.111165 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea78c718-ea55-4e9c-990b-9f214cf08ef6-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:45:03.111177 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:03.111177 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-txhzj\" (UniqueName: \"kubernetes.io/projected/ea78c718-ea55-4e9c-990b-9f214cf08ef6-kube-api-access-txhzj\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:45:03.111483 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:03.111191 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea78c718-ea55-4e9c-990b-9f214cf08ef6-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:45:03.362748 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:03.362712 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7" event={"ID":"ea78c718-ea55-4e9c-990b-9f214cf08ef6","Type":"ContainerDied","Data":"4bfbb6cd30710eb02aece20a8790445e5c2bfc17ad42b704f628999389c99e75"}
Apr 25 00:45:03.363244 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:03.362787 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"
Apr 25 00:45:03.363244 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:03.362769 2576 scope.go:117] "RemoveContainer" containerID="1416d55d8a3bfeb4f85eeec4a1ee36fe9fd845ecfaa6a512ba91d39a0ff95266"
Apr 25 00:45:03.374848 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:03.374828 2576 scope.go:117] "RemoveContainer" containerID="10d2a049c25ada01830c06f5f98c97f1cc950b45536918ebefa04747f85a8fe7"
Apr 25 00:45:03.384312 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:03.384290 2576 scope.go:117] "RemoveContainer" containerID="0a31a4efdcd065c7a0fdea7eae4c3153b010f8660ddfe60574be86725d1ec057"
Apr 25 00:45:03.388908 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:03.388871 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"]
Apr 25 00:45:03.392792 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:03.392752 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-m4fj7"]
Apr 25 00:45:04.317034 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:45:04.316998 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea78c718-ea55-4e9c-990b-9f214cf08ef6" path="/var/lib/kubelet/pods/ea78c718-ea55-4e9c-990b-9f214cf08ef6/volumes"
Apr 25 00:46:32.648829 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:32.648790 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" event={"ID":"1867c476-c38b-4261-9709-c07b79439cce","Type":"ContainerStarted","Data":"bec0ec98c814c5683d8af48adfdf6a7d303fe8c36a32cbe1271de24f8f1f0346"}
Apr 25 00:46:32.648829 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:32.648830 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" event={"ID":"1867c476-c38b-4261-9709-c07b79439cce","Type":"ContainerStarted","Data":"ad8c4fdcaa8b4a090a9693c81582654da1f9aef82fa4f5e030a490693fe1d8ac"}
Apr 25 00:46:32.649352 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:32.648951 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f"
Apr 25 00:46:32.667167 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:32.667123 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" podStartSLOduration=5.98499795 podStartE2EDuration="2m0.667108757s" podCreationTimestamp="2026-04-25 00:44:32 +0000 UTC" firstStartedPulling="2026-04-25 00:44:37.24771076 +0000 UTC m=+3037.528827658" lastFinishedPulling="2026-04-25 00:46:31.929821568 +0000 UTC m=+3152.210938465" observedRunningTime="2026-04-25 00:46:32.666133547 +0000 UTC m=+3152.947250469" watchObservedRunningTime="2026-04-25 00:46:32.667108757 +0000 UTC m=+3152.948225676"
Apr 25 00:46:33.652051 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:33.652024 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f"
Apr 25 00:46:33.652789 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:33.652764 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" podUID="1867c476-c38b-4261-9709-c07b79439cce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 25 00:46:34.655194 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:34.655156 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" podUID="1867c476-c38b-4261-9709-c07b79439cce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 25 00:46:39.659525 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:39.659498 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f"
Apr 25 00:46:39.660192 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:39.660173 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f"
Apr 25 00:46:44.745131 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:44.745101 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f"]
Apr 25 00:46:44.745783 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:44.745403 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" podUID="1867c476-c38b-4261-9709-c07b79439cce" containerName="kserve-container" containerID="cri-o://ad8c4fdcaa8b4a090a9693c81582654da1f9aef82fa4f5e030a490693fe1d8ac" gracePeriod=30
Apr 25 00:46:44.745783 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:44.745447 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" podUID="1867c476-c38b-4261-9709-c07b79439cce" containerName="kube-rbac-proxy" containerID="cri-o://bec0ec98c814c5683d8af48adfdf6a7d303fe8c36a32cbe1271de24f8f1f0346" gracePeriod=30
Apr 25 00:46:44.844180 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:44.844150 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"]
Apr 25 00:46:44.844431 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:44.844418 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea78c718-ea55-4e9c-990b-9f214cf08ef6" containerName="kube-rbac-proxy"
Apr 25 00:46:44.844483 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:44.844433 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea78c718-ea55-4e9c-990b-9f214cf08ef6" containerName="kube-rbac-proxy"
Apr 25 00:46:44.844483 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:44.844445 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea78c718-ea55-4e9c-990b-9f214cf08ef6" containerName="storage-initializer"
Apr 25 00:46:44.844483 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:44.844452 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea78c718-ea55-4e9c-990b-9f214cf08ef6" containerName="storage-initializer"
Apr 25 00:46:44.844483 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:44.844461 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea78c718-ea55-4e9c-990b-9f214cf08ef6" containerName="kserve-container"
Apr 25 00:46:44.844483 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:44.844466 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea78c718-ea55-4e9c-990b-9f214cf08ef6" containerName="kserve-container"
Apr 25 00:46:44.844639 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:44.844527 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea78c718-ea55-4e9c-990b-9f214cf08ef6" containerName="kserve-container"
Apr 25 00:46:44.844639 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:44.844535 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea78c718-ea55-4e9c-990b-9f214cf08ef6" containerName="kube-rbac-proxy"
Apr 25 00:46:44.855897 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:44.855861 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"
Apr 25 00:46:44.857339 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:44.857297 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"]
Apr 25 00:46:44.858069 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:44.858044 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-predictor-serving-cert\""
Apr 25 00:46:44.858174 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:44.858105 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-kube-rbac-proxy-sar-config\""
Apr 25 00:46:45.039739 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:45.039705 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5287581e-5050-452c-99c7-4d07adec38e5-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8smg\" (UID: \"5287581e-5050-452c-99c7-4d07adec38e5\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"
Apr 25 00:46:45.039983 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:45.039747 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5287581e-5050-452c-99c7-4d07adec38e5-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8smg\" (UID: \"5287581e-5050-452c-99c7-4d07adec38e5\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"
Apr 25 00:46:45.039983 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:45.039766 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5287581e-5050-452c-99c7-4d07adec38e5-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8smg\" (UID: \"5287581e-5050-452c-99c7-4d07adec38e5\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"
Apr 25 00:46:45.039983 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:45.039784 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rllz4\" (UniqueName: \"kubernetes.io/projected/5287581e-5050-452c-99c7-4d07adec38e5-kube-api-access-rllz4\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8smg\" (UID: \"5287581e-5050-452c-99c7-4d07adec38e5\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"
Apr 25 00:46:45.140525 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:45.140495 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5287581e-5050-452c-99c7-4d07adec38e5-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8smg\" (UID: \"5287581e-5050-452c-99c7-4d07adec38e5\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"
Apr 25 00:46:45.140728 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:45.140531 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5287581e-5050-452c-99c7-4d07adec38e5-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8smg\" (UID: \"5287581e-5050-452c-99c7-4d07adec38e5\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"
Apr 25 00:46:45.140728 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:45.140549 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5287581e-5050-452c-99c7-4d07adec38e5-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8smg\" (UID: \"5287581e-5050-452c-99c7-4d07adec38e5\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"
Apr 25 00:46:45.140728 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:45.140573 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rllz4\" (UniqueName: \"kubernetes.io/projected/5287581e-5050-452c-99c7-4d07adec38e5-kube-api-access-rllz4\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8smg\" (UID: \"5287581e-5050-452c-99c7-4d07adec38e5\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"
Apr 25 00:46:45.140728 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:46:45.140647 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-predictor-serving-cert: secret "isvc-xgboost-predictor-serving-cert" not found
Apr 25 00:46:45.140728 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:46:45.140718 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5287581e-5050-452c-99c7-4d07adec38e5-proxy-tls podName:5287581e-5050-452c-99c7-4d07adec38e5 nodeName:}" failed. No retries permitted until 2026-04-25 00:46:45.640701247 +0000 UTC m=+3165.921818149 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5287581e-5050-452c-99c7-4d07adec38e5-proxy-tls") pod "isvc-xgboost-predictor-8689c4cfcc-l8smg" (UID: "5287581e-5050-452c-99c7-4d07adec38e5") : secret "isvc-xgboost-predictor-serving-cert" not found
Apr 25 00:46:45.140983 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:45.140966 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5287581e-5050-452c-99c7-4d07adec38e5-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8smg\" (UID: \"5287581e-5050-452c-99c7-4d07adec38e5\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"
Apr 25 00:46:45.141229 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:45.141213 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5287581e-5050-452c-99c7-4d07adec38e5-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8smg\" (UID: \"5287581e-5050-452c-99c7-4d07adec38e5\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"
Apr 25 00:46:45.149319 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:45.149297 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rllz4\" (UniqueName: \"kubernetes.io/projected/5287581e-5050-452c-99c7-4d07adec38e5-kube-api-access-rllz4\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8smg\" (UID: \"5287581e-5050-452c-99c7-4d07adec38e5\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"
Apr 25 00:46:45.646091 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:45.646049 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5287581e-5050-452c-99c7-4d07adec38e5-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8smg\" (UID: \"5287581e-5050-452c-99c7-4d07adec38e5\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"
Apr 25 00:46:45.648588 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:45.648569 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5287581e-5050-452c-99c7-4d07adec38e5-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-l8smg\" (UID: \"5287581e-5050-452c-99c7-4d07adec38e5\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"
Apr 25 00:46:45.688542 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:45.688513 2576 generic.go:358] "Generic (PLEG): container finished" podID="1867c476-c38b-4261-9709-c07b79439cce" containerID="bec0ec98c814c5683d8af48adfdf6a7d303fe8c36a32cbe1271de24f8f1f0346" exitCode=2
Apr 25 00:46:45.688685 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:45.688587 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" event={"ID":"1867c476-c38b-4261-9709-c07b79439cce","Type":"ContainerDied","Data":"bec0ec98c814c5683d8af48adfdf6a7d303fe8c36a32cbe1271de24f8f1f0346"}
Apr 25 00:46:45.767229 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:45.767201 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"
Apr 25 00:46:46.020819 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:46.020799 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"]
Apr 25 00:46:46.023633 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:46:46.023607 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5287581e_5050_452c_99c7_4d07adec38e5.slice/crio-c5115a800d75619007d7ee024acc0457ebae828da374f89076835073f6c019bf WatchSource:0}: Error finding container c5115a800d75619007d7ee024acc0457ebae828da374f89076835073f6c019bf: Status 404 returned error can't find the container with id c5115a800d75619007d7ee024acc0457ebae828da374f89076835073f6c019bf
Apr 25 00:46:46.693407 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:46.693377 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg" event={"ID":"5287581e-5050-452c-99c7-4d07adec38e5","Type":"ContainerStarted","Data":"f9418c3cc056501fe2560a46a6c6118920472d7159d199f640cc4c762ffca1ba"}
Apr 25 00:46:46.693407 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:46.693411 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg" event={"ID":"5287581e-5050-452c-99c7-4d07adec38e5","Type":"ContainerStarted","Data":"c5115a800d75619007d7ee024acc0457ebae828da374f89076835073f6c019bf"}
Apr 25 00:46:47.703388 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:47.703353 2576 generic.go:358] "Generic (PLEG): container finished" podID="1867c476-c38b-4261-9709-c07b79439cce" containerID="ad8c4fdcaa8b4a090a9693c81582654da1f9aef82fa4f5e030a490693fe1d8ac" exitCode=0
Apr 25 00:46:47.703774 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:47.703425 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" event={"ID":"1867c476-c38b-4261-9709-c07b79439cce","Type":"ContainerDied","Data":"ad8c4fdcaa8b4a090a9693c81582654da1f9aef82fa4f5e030a490693fe1d8ac"}
Apr 25 00:46:47.703774 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:47.703458 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f" event={"ID":"1867c476-c38b-4261-9709-c07b79439cce","Type":"ContainerDied","Data":"e60632ad8fa32c223f3d550aad8356a3f1c913c28e7dd2c74f64a81402e104b9"}
Apr 25 00:46:47.703774 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:47.703480 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e60632ad8fa32c223f3d550aad8356a3f1c913c28e7dd2c74f64a81402e104b9"
Apr 25 00:46:47.708543 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:47.708525 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f"
Apr 25 00:46:47.759680 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:47.759612 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1867c476-c38b-4261-9709-c07b79439cce-kserve-provision-location\") pod \"1867c476-c38b-4261-9709-c07b79439cce\" (UID: \"1867c476-c38b-4261-9709-c07b79439cce\") "
Apr 25 00:46:47.759680 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:47.759657 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4m8g\" (UniqueName: \"kubernetes.io/projected/1867c476-c38b-4261-9709-c07b79439cce-kube-api-access-g4m8g\") pod \"1867c476-c38b-4261-9709-c07b79439cce\" (UID: \"1867c476-c38b-4261-9709-c07b79439cce\") "
Apr 25 00:46:47.760042 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:47.760017 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1867c476-c38b-4261-9709-c07b79439cce-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1867c476-c38b-4261-9709-c07b79439cce" (UID: "1867c476-c38b-4261-9709-c07b79439cce"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:46:47.761856 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:47.761832 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1867c476-c38b-4261-9709-c07b79439cce-kube-api-access-g4m8g" (OuterVolumeSpecName: "kube-api-access-g4m8g") pod "1867c476-c38b-4261-9709-c07b79439cce" (UID: "1867c476-c38b-4261-9709-c07b79439cce"). InnerVolumeSpecName "kube-api-access-g4m8g". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:46:47.860223 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:47.860200 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1867c476-c38b-4261-9709-c07b79439cce-proxy-tls\") pod \"1867c476-c38b-4261-9709-c07b79439cce\" (UID: \"1867c476-c38b-4261-9709-c07b79439cce\") "
Apr 25 00:46:47.860341 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:47.860260 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1867c476-c38b-4261-9709-c07b79439cce-isvc-triton-kube-rbac-proxy-sar-config\") pod \"1867c476-c38b-4261-9709-c07b79439cce\" (UID: \"1867c476-c38b-4261-9709-c07b79439cce\") "
Apr 25 00:46:47.860386 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:47.860342 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g4m8g\" (UniqueName: \"kubernetes.io/projected/1867c476-c38b-4261-9709-c07b79439cce-kube-api-access-g4m8g\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:46:47.860386 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:47.860358 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1867c476-c38b-4261-9709-c07b79439cce-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:46:47.860608 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:47.860581 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1867c476-c38b-4261-9709-c07b79439cce-isvc-triton-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-triton-kube-rbac-proxy-sar-config") pod "1867c476-c38b-4261-9709-c07b79439cce" (UID: "1867c476-c38b-4261-9709-c07b79439cce"). InnerVolumeSpecName "isvc-triton-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:46:47.862339 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:47.862314 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1867c476-c38b-4261-9709-c07b79439cce-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1867c476-c38b-4261-9709-c07b79439cce" (UID: "1867c476-c38b-4261-9709-c07b79439cce"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:46:47.960730 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:47.960696 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1867c476-c38b-4261-9709-c07b79439cce-isvc-triton-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:46:47.960730 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:47.960724 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1867c476-c38b-4261-9709-c07b79439cce-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:46:48.706315 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:48.706235 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f"
Apr 25 00:46:48.722482 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:48.722447 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f"]
Apr 25 00:46:48.725646 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:48.725620 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-xk24f"]
Apr 25 00:46:50.315604 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:50.315573 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1867c476-c38b-4261-9709-c07b79439cce" path="/var/lib/kubelet/pods/1867c476-c38b-4261-9709-c07b79439cce/volumes"
Apr 25 00:46:50.712581 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:50.712550 2576 generic.go:358] "Generic (PLEG): container finished" podID="5287581e-5050-452c-99c7-4d07adec38e5" containerID="f9418c3cc056501fe2560a46a6c6118920472d7159d199f640cc4c762ffca1ba" exitCode=0
Apr 25 00:46:50.712581 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:46:50.712586 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg" event={"ID":"5287581e-5050-452c-99c7-4d07adec38e5","Type":"ContainerDied","Data":"f9418c3cc056501fe2560a46a6c6118920472d7159d199f640cc4c762ffca1ba"}
Apr 25 00:47:10.773885 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:47:10.773850 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg" event={"ID":"5287581e-5050-452c-99c7-4d07adec38e5","Type":"ContainerStarted","Data":"cbc76a17bf507816f4a85a651cad1298710b6683d273370f70552ab485c93cd4"}
Apr 25 00:47:10.773885 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:47:10.773891 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg" event={"ID":"5287581e-5050-452c-99c7-4d07adec38e5","Type":"ContainerStarted","Data":"4695f18939398f5c34d70a8a4140cbbf21bd4c315d888fe5c2ec3fb652f0e825"}
Apr 25 00:47:10.774338 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:47:10.774209 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"
Apr 25 00:47:10.774377 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:47:10.774347 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"
Apr 25 00:47:10.775264 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:47:10.775240 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg" podUID="5287581e-5050-452c-99c7-4d07adec38e5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 25 00:47:10.798500 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:47:10.798457 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg" podStartSLOduration=7.821147612 podStartE2EDuration="26.798446872s" podCreationTimestamp="2026-04-25 00:46:44 +0000 UTC" firstStartedPulling="2026-04-25 00:46:50.713873849 +0000 UTC m=+3170.994990746" lastFinishedPulling="2026-04-25 00:47:09.691173105 +0000 UTC m=+3189.972290006" observedRunningTime="2026-04-25 00:47:10.796700053 +0000 UTC m=+3191.077816973" watchObservedRunningTime="2026-04-25 00:47:10.798446872 +0000 UTC m=+3191.079563792"
Apr 25 00:47:11.777047 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:47:11.777007 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg" podUID="5287581e-5050-452c-99c7-4d07adec38e5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 25 00:47:16.780840 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:47:16.780812 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"
Apr 25 00:47:16.781334 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:47:16.781311 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg" podUID="5287581e-5050-452c-99c7-4d07adec38e5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 25 00:47:26.781320 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:47:26.781274 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg" podUID="5287581e-5050-452c-99c7-4d07adec38e5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 25 00:47:36.782084 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:47:36.782043 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg" podUID="5287581e-5050-452c-99c7-4d07adec38e5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 25 00:47:46.782064 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:47:46.782018 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg" podUID="5287581e-5050-452c-99c7-4d07adec38e5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 25 00:47:56.781622 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:47:56.781583 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg" podUID="5287581e-5050-452c-99c7-4d07adec38e5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 25 00:48:06.782138 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:06.782106 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"
Apr 25 00:48:14.929867 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:14.929820 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"]
Apr 25 00:48:14.930375 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:14.930259 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg" podUID="5287581e-5050-452c-99c7-4d07adec38e5" containerName="kserve-container" containerID="cri-o://4695f18939398f5c34d70a8a4140cbbf21bd4c315d888fe5c2ec3fb652f0e825" gracePeriod=30
Apr 25 00:48:14.930375 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:14.930302 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg" podUID="5287581e-5050-452c-99c7-4d07adec38e5" containerName="kube-rbac-proxy" containerID="cri-o://cbc76a17bf507816f4a85a651cad1298710b6683d273370f70552ab485c93cd4" gracePeriod=30
Apr 25 00:48:15.033935 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.033872 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr"]
Apr 25 00:48:15.034220 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.034206 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1867c476-c38b-4261-9709-c07b79439cce" containerName="kube-rbac-proxy"
Apr 25 00:48:15.034298 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.034224 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1867c476-c38b-4261-9709-c07b79439cce" containerName="kube-rbac-proxy"
Apr 25 00:48:15.034298 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.034246 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1867c476-c38b-4261-9709-c07b79439cce" containerName="storage-initializer"
Apr 25 00:48:15.034298 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.034256 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1867c476-c38b-4261-9709-c07b79439cce" containerName="storage-initializer"
Apr 25 00:48:15.034298 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.034270 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1867c476-c38b-4261-9709-c07b79439cce" containerName="kserve-container"
Apr 25 00:48:15.034298 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.034277 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1867c476-c38b-4261-9709-c07b79439cce" containerName="kserve-container"
Apr 25 00:48:15.034483 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.034322 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1867c476-c38b-4261-9709-c07b79439cce" containerName="kserve-container"
Apr 25 00:48:15.034483 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.034329 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1867c476-c38b-4261-9709-c07b79439cce" containerName="kube-rbac-proxy"
Apr 25 00:48:15.037365 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.037338 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr"
Apr 25 00:48:15.039558 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.039533 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\""
Apr 25 00:48:15.039691 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.039533 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-predictor-serving-cert\""
Apr 25 00:48:15.047981 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.047943 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr"]
Apr 25 00:48:15.107990 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.107952 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8491019-8b5c-4128-acf7-92e64fd24250-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr\" (UID: \"c8491019-8b5c-4128-acf7-92e64fd24250\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr"
Apr 25 00:48:15.108142 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.108020 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c8491019-8b5c-4128-acf7-92e64fd24250-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr\" (UID: \"c8491019-8b5c-4128-acf7-92e64fd24250\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr"
Apr 25 00:48:15.108142 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.108046 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpw5f\" (UniqueName: \"kubernetes.io/projected/c8491019-8b5c-4128-acf7-92e64fd24250-kube-api-access-fpw5f\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr\" (UID: \"c8491019-8b5c-4128-acf7-92e64fd24250\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr"
Apr 25 00:48:15.108142 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.108064 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8491019-8b5c-4128-acf7-92e64fd24250-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr\" (UID: \"c8491019-8b5c-4128-acf7-92e64fd24250\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr"
Apr 25 00:48:15.208412 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.208326 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c8491019-8b5c-4128-acf7-92e64fd24250-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr\" (UID: \"c8491019-8b5c-4128-acf7-92e64fd24250\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr"
Apr 25 00:48:15.208412 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.208371 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpw5f\" (UniqueName: \"kubernetes.io/projected/c8491019-8b5c-4128-acf7-92e64fd24250-kube-api-access-fpw5f\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr\" (UID: \"c8491019-8b5c-4128-acf7-92e64fd24250\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr"
Apr 25 00:48:15.208412 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.208397 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8491019-8b5c-4128-acf7-92e64fd24250-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr\" (UID: \"c8491019-8b5c-4128-acf7-92e64fd24250\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr"
Apr 25 00:48:15.208693 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.208446 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8491019-8b5c-4128-acf7-92e64fd24250-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr\" (UID: \"c8491019-8b5c-4128-acf7-92e64fd24250\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr"
Apr 25 00:48:15.208693 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:48:15.208594 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-serving-cert: secret "isvc-xgboost-v2-mlserver-predictor-serving-cert" not found
Apr 25 00:48:15.208693 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:48:15.208665 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8491019-8b5c-4128-acf7-92e64fd24250-proxy-tls podName:c8491019-8b5c-4128-acf7-92e64fd24250 nodeName:}" failed. No retries permitted until 2026-04-25 00:48:15.708642669 +0000 UTC m=+3255.989759581 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c8491019-8b5c-4128-acf7-92e64fd24250-proxy-tls") pod "isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" (UID: "c8491019-8b5c-4128-acf7-92e64fd24250") : secret "isvc-xgboost-v2-mlserver-predictor-serving-cert" not found Apr 25 00:48:15.208863 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.208787 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8491019-8b5c-4128-acf7-92e64fd24250-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr\" (UID: \"c8491019-8b5c-4128-acf7-92e64fd24250\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" Apr 25 00:48:15.208996 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.208978 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c8491019-8b5c-4128-acf7-92e64fd24250-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr\" (UID: \"c8491019-8b5c-4128-acf7-92e64fd24250\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" Apr 25 00:48:15.216807 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.216786 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpw5f\" (UniqueName: \"kubernetes.io/projected/c8491019-8b5c-4128-acf7-92e64fd24250-kube-api-access-fpw5f\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr\" (UID: \"c8491019-8b5c-4128-acf7-92e64fd24250\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" Apr 25 00:48:15.712383 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.712351 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/c8491019-8b5c-4128-acf7-92e64fd24250-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr\" (UID: \"c8491019-8b5c-4128-acf7-92e64fd24250\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" Apr 25 00:48:15.714900 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.714869 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8491019-8b5c-4128-acf7-92e64fd24250-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr\" (UID: \"c8491019-8b5c-4128-acf7-92e64fd24250\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" Apr 25 00:48:15.948507 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.948473 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" Apr 25 00:48:15.957588 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.957562 2576 generic.go:358] "Generic (PLEG): container finished" podID="5287581e-5050-452c-99c7-4d07adec38e5" containerID="cbc76a17bf507816f4a85a651cad1298710b6683d273370f70552ab485c93cd4" exitCode=2 Apr 25 00:48:15.957713 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:15.957630 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg" event={"ID":"5287581e-5050-452c-99c7-4d07adec38e5","Type":"ContainerDied","Data":"cbc76a17bf507816f4a85a651cad1298710b6683d273370f70552ab485c93cd4"} Apr 25 00:48:16.068046 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:16.068014 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr"] Apr 25 00:48:16.073040 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:48:16.073005 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8491019_8b5c_4128_acf7_92e64fd24250.slice/crio-9de62ac68bd3a83e4ab8526f0acefd69bf3fb06a7f31fbe4a772d3f4c296a34e WatchSource:0}: Error finding container 9de62ac68bd3a83e4ab8526f0acefd69bf3fb06a7f31fbe4a772d3f4c296a34e: Status 404 returned error can't find the container with id 9de62ac68bd3a83e4ab8526f0acefd69bf3fb06a7f31fbe4a772d3f4c296a34e Apr 25 00:48:16.074727 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:16.074704 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:48:16.777606 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:16.777567 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg" podUID="5287581e-5050-452c-99c7-4d07adec38e5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.56:8643/healthz\": dial tcp 10.134.0.56:8643: connect: connection refused" Apr 25 00:48:16.781859 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:16.781826 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg" podUID="5287581e-5050-452c-99c7-4d07adec38e5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 25 00:48:16.961742 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:16.961706 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" event={"ID":"c8491019-8b5c-4128-acf7-92e64fd24250","Type":"ContainerStarted","Data":"8429b79f2b11810d1f29d8995987cb0892823179111de4ce97ccc8c195273c76"} Apr 25 00:48:16.962103 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:16.961749 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" 
event={"ID":"c8491019-8b5c-4128-acf7-92e64fd24250","Type":"ContainerStarted","Data":"9de62ac68bd3a83e4ab8526f0acefd69bf3fb06a7f31fbe4a772d3f4c296a34e"} Apr 25 00:48:18.464173 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.464152 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg" Apr 25 00:48:18.531513 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.531483 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5287581e-5050-452c-99c7-4d07adec38e5-proxy-tls\") pod \"5287581e-5050-452c-99c7-4d07adec38e5\" (UID: \"5287581e-5050-452c-99c7-4d07adec38e5\") " Apr 25 00:48:18.531711 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.531526 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5287581e-5050-452c-99c7-4d07adec38e5-kserve-provision-location\") pod \"5287581e-5050-452c-99c7-4d07adec38e5\" (UID: \"5287581e-5050-452c-99c7-4d07adec38e5\") " Apr 25 00:48:18.531711 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.531563 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rllz4\" (UniqueName: \"kubernetes.io/projected/5287581e-5050-452c-99c7-4d07adec38e5-kube-api-access-rllz4\") pod \"5287581e-5050-452c-99c7-4d07adec38e5\" (UID: \"5287581e-5050-452c-99c7-4d07adec38e5\") " Apr 25 00:48:18.531711 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.531588 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5287581e-5050-452c-99c7-4d07adec38e5-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"5287581e-5050-452c-99c7-4d07adec38e5\" (UID: \"5287581e-5050-452c-99c7-4d07adec38e5\") " Apr 25 00:48:18.531975 ip-10-0-129-109 
kubenswrapper[2576]: I0425 00:48:18.531939 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5287581e-5050-452c-99c7-4d07adec38e5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5287581e-5050-452c-99c7-4d07adec38e5" (UID: "5287581e-5050-452c-99c7-4d07adec38e5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:48:18.532107 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.532062 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5287581e-5050-452c-99c7-4d07adec38e5-isvc-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-kube-rbac-proxy-sar-config") pod "5287581e-5050-452c-99c7-4d07adec38e5" (UID: "5287581e-5050-452c-99c7-4d07adec38e5"). InnerVolumeSpecName "isvc-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:48:18.533711 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.533689 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5287581e-5050-452c-99c7-4d07adec38e5-kube-api-access-rllz4" (OuterVolumeSpecName: "kube-api-access-rllz4") pod "5287581e-5050-452c-99c7-4d07adec38e5" (UID: "5287581e-5050-452c-99c7-4d07adec38e5"). InnerVolumeSpecName "kube-api-access-rllz4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:48:18.533774 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.533709 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5287581e-5050-452c-99c7-4d07adec38e5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5287581e-5050-452c-99c7-4d07adec38e5" (UID: "5287581e-5050-452c-99c7-4d07adec38e5"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:48:18.632566 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.632533 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5287581e-5050-452c-99c7-4d07adec38e5-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:48:18.632566 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.632559 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5287581e-5050-452c-99c7-4d07adec38e5-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:48:18.632566 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.632568 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rllz4\" (UniqueName: \"kubernetes.io/projected/5287581e-5050-452c-99c7-4d07adec38e5-kube-api-access-rllz4\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:48:18.632800 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.632580 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5287581e-5050-452c-99c7-4d07adec38e5-isvc-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:48:18.967932 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.967816 2576 generic.go:358] "Generic (PLEG): container finished" podID="5287581e-5050-452c-99c7-4d07adec38e5" containerID="4695f18939398f5c34d70a8a4140cbbf21bd4c315d888fe5c2ec3fb652f0e825" exitCode=0 Apr 25 00:48:18.967932 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.967883 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg" event={"ID":"5287581e-5050-452c-99c7-4d07adec38e5","Type":"ContainerDied","Data":"4695f18939398f5c34d70a8a4140cbbf21bd4c315d888fe5c2ec3fb652f0e825"} 
Apr 25 00:48:18.967932 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.967897 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg" Apr 25 00:48:18.967932 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.967933 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg" event={"ID":"5287581e-5050-452c-99c7-4d07adec38e5","Type":"ContainerDied","Data":"c5115a800d75619007d7ee024acc0457ebae828da374f89076835073f6c019bf"} Apr 25 00:48:18.968227 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.967951 2576 scope.go:117] "RemoveContainer" containerID="cbc76a17bf507816f4a85a651cad1298710b6683d273370f70552ab485c93cd4" Apr 25 00:48:18.976205 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.976188 2576 scope.go:117] "RemoveContainer" containerID="4695f18939398f5c34d70a8a4140cbbf21bd4c315d888fe5c2ec3fb652f0e825" Apr 25 00:48:18.983376 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.983359 2576 scope.go:117] "RemoveContainer" containerID="f9418c3cc056501fe2560a46a6c6118920472d7159d199f640cc4c762ffca1ba" Apr 25 00:48:18.990127 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.990104 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"] Apr 25 00:48:18.992489 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.992467 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-l8smg"] Apr 25 00:48:18.993645 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.993611 2576 scope.go:117] "RemoveContainer" containerID="cbc76a17bf507816f4a85a651cad1298710b6683d273370f70552ab485c93cd4" Apr 25 00:48:18.993926 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:48:18.993887 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cbc76a17bf507816f4a85a651cad1298710b6683d273370f70552ab485c93cd4\": container with ID starting with cbc76a17bf507816f4a85a651cad1298710b6683d273370f70552ab485c93cd4 not found: ID does not exist" containerID="cbc76a17bf507816f4a85a651cad1298710b6683d273370f70552ab485c93cd4" Apr 25 00:48:18.994006 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.993939 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbc76a17bf507816f4a85a651cad1298710b6683d273370f70552ab485c93cd4"} err="failed to get container status \"cbc76a17bf507816f4a85a651cad1298710b6683d273370f70552ab485c93cd4\": rpc error: code = NotFound desc = could not find container \"cbc76a17bf507816f4a85a651cad1298710b6683d273370f70552ab485c93cd4\": container with ID starting with cbc76a17bf507816f4a85a651cad1298710b6683d273370f70552ab485c93cd4 not found: ID does not exist" Apr 25 00:48:18.994006 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.993960 2576 scope.go:117] "RemoveContainer" containerID="4695f18939398f5c34d70a8a4140cbbf21bd4c315d888fe5c2ec3fb652f0e825" Apr 25 00:48:18.994290 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:48:18.994273 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4695f18939398f5c34d70a8a4140cbbf21bd4c315d888fe5c2ec3fb652f0e825\": container with ID starting with 4695f18939398f5c34d70a8a4140cbbf21bd4c315d888fe5c2ec3fb652f0e825 not found: ID does not exist" containerID="4695f18939398f5c34d70a8a4140cbbf21bd4c315d888fe5c2ec3fb652f0e825" Apr 25 00:48:18.994342 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.994296 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4695f18939398f5c34d70a8a4140cbbf21bd4c315d888fe5c2ec3fb652f0e825"} err="failed to get container status \"4695f18939398f5c34d70a8a4140cbbf21bd4c315d888fe5c2ec3fb652f0e825\": rpc error: code = NotFound desc = could not find container 
\"4695f18939398f5c34d70a8a4140cbbf21bd4c315d888fe5c2ec3fb652f0e825\": container with ID starting with 4695f18939398f5c34d70a8a4140cbbf21bd4c315d888fe5c2ec3fb652f0e825 not found: ID does not exist" Apr 25 00:48:18.994342 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.994311 2576 scope.go:117] "RemoveContainer" containerID="f9418c3cc056501fe2560a46a6c6118920472d7159d199f640cc4c762ffca1ba" Apr 25 00:48:18.994497 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:48:18.994483 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9418c3cc056501fe2560a46a6c6118920472d7159d199f640cc4c762ffca1ba\": container with ID starting with f9418c3cc056501fe2560a46a6c6118920472d7159d199f640cc4c762ffca1ba not found: ID does not exist" containerID="f9418c3cc056501fe2560a46a6c6118920472d7159d199f640cc4c762ffca1ba" Apr 25 00:48:18.994538 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:18.994499 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9418c3cc056501fe2560a46a6c6118920472d7159d199f640cc4c762ffca1ba"} err="failed to get container status \"f9418c3cc056501fe2560a46a6c6118920472d7159d199f640cc4c762ffca1ba\": rpc error: code = NotFound desc = could not find container \"f9418c3cc056501fe2560a46a6c6118920472d7159d199f640cc4c762ffca1ba\": container with ID starting with f9418c3cc056501fe2560a46a6c6118920472d7159d199f640cc4c762ffca1ba not found: ID does not exist" Apr 25 00:48:19.971667 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:19.971637 2576 generic.go:358] "Generic (PLEG): container finished" podID="c8491019-8b5c-4128-acf7-92e64fd24250" containerID="8429b79f2b11810d1f29d8995987cb0892823179111de4ce97ccc8c195273c76" exitCode=0 Apr 25 00:48:19.972026 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:19.971704 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" 
event={"ID":"c8491019-8b5c-4128-acf7-92e64fd24250","Type":"ContainerDied","Data":"8429b79f2b11810d1f29d8995987cb0892823179111de4ce97ccc8c195273c76"} Apr 25 00:48:20.315311 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:20.315269 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5287581e-5050-452c-99c7-4d07adec38e5" path="/var/lib/kubelet/pods/5287581e-5050-452c-99c7-4d07adec38e5/volumes" Apr 25 00:48:20.976876 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:20.976842 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" event={"ID":"c8491019-8b5c-4128-acf7-92e64fd24250","Type":"ContainerStarted","Data":"475a7b566f34c3fc29ceee907725f2b95c14a55a60e200795ecb144db6c48137"} Apr 25 00:48:20.976876 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:20.976881 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" event={"ID":"c8491019-8b5c-4128-acf7-92e64fd24250","Type":"ContainerStarted","Data":"1a321231077007fa3b0cd768db5f175c964f2f76eeb1481bd23014cb6a4cc20b"} Apr 25 00:48:20.977321 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:20.977205 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" Apr 25 00:48:21.008736 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:21.008690 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" podStartSLOduration=6.008678708 podStartE2EDuration="6.008678708s" podCreationTimestamp="2026-04-25 00:48:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:48:21.006659527 +0000 UTC m=+3261.287776447" watchObservedRunningTime="2026-04-25 00:48:21.008678708 +0000 UTC 
m=+3261.289795619" Apr 25 00:48:21.979992 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:21.979963 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" Apr 25 00:48:27.988939 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:27.988891 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" Apr 25 00:48:57.997966 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:48:57.997931 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" Apr 25 00:49:05.080168 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.080132 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr"] Apr 25 00:49:05.080690 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.080540 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" podUID="c8491019-8b5c-4128-acf7-92e64fd24250" containerName="kserve-container" containerID="cri-o://1a321231077007fa3b0cd768db5f175c964f2f76eeb1481bd23014cb6a4cc20b" gracePeriod=30 Apr 25 00:49:05.080690 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.080610 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" podUID="c8491019-8b5c-4128-acf7-92e64fd24250" containerName="kube-rbac-proxy" containerID="cri-o://475a7b566f34c3fc29ceee907725f2b95c14a55a60e200795ecb144db6c48137" gracePeriod=30 Apr 25 00:49:05.160233 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.160199 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9"] Apr 25 00:49:05.160479 
ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.160467 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5287581e-5050-452c-99c7-4d07adec38e5" containerName="storage-initializer" Apr 25 00:49:05.160479 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.160480 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5287581e-5050-452c-99c7-4d07adec38e5" containerName="storage-initializer" Apr 25 00:49:05.160571 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.160498 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5287581e-5050-452c-99c7-4d07adec38e5" containerName="kube-rbac-proxy" Apr 25 00:49:05.160571 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.160504 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5287581e-5050-452c-99c7-4d07adec38e5" containerName="kube-rbac-proxy" Apr 25 00:49:05.160571 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.160517 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5287581e-5050-452c-99c7-4d07adec38e5" containerName="kserve-container" Apr 25 00:49:05.160571 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.160523 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5287581e-5050-452c-99c7-4d07adec38e5" containerName="kserve-container" Apr 25 00:49:05.160571 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.160565 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5287581e-5050-452c-99c7-4d07adec38e5" containerName="kube-rbac-proxy" Apr 25 00:49:05.160571 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.160573 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5287581e-5050-452c-99c7-4d07adec38e5" containerName="kserve-container" Apr 25 00:49:05.165049 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.165031 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9" Apr 25 00:49:05.167427 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.167407 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 25 00:49:05.167521 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.167503 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-predictor-serving-cert\"" Apr 25 00:49:05.172733 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.172711 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9"] Apr 25 00:49:05.276439 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.276408 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7625aea-4beb-4a38-907f-9a757858ca0e-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-f2fl9\" (UID: \"c7625aea-4beb-4a38-907f-9a757858ca0e\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9" Apr 25 00:49:05.276565 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.276461 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7625aea-4beb-4a38-907f-9a757858ca0e-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-f2fl9\" (UID: \"c7625aea-4beb-4a38-907f-9a757858ca0e\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9" Apr 25 00:49:05.276565 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.276511 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnr8d\" (UniqueName: 
\"kubernetes.io/projected/c7625aea-4beb-4a38-907f-9a757858ca0e-kube-api-access-jnr8d\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-f2fl9\" (UID: \"c7625aea-4beb-4a38-907f-9a757858ca0e\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9"
Apr 25 00:49:05.276565 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.276557 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7625aea-4beb-4a38-907f-9a757858ca0e-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-f2fl9\" (UID: \"c7625aea-4beb-4a38-907f-9a757858ca0e\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9"
Apr 25 00:49:05.377944 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.377841 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7625aea-4beb-4a38-907f-9a757858ca0e-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-f2fl9\" (UID: \"c7625aea-4beb-4a38-907f-9a757858ca0e\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9"
Apr 25 00:49:05.377944 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.377894 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jnr8d\" (UniqueName: \"kubernetes.io/projected/c7625aea-4beb-4a38-907f-9a757858ca0e-kube-api-access-jnr8d\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-f2fl9\" (UID: \"c7625aea-4beb-4a38-907f-9a757858ca0e\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9"
Apr 25 00:49:05.378157 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.377981 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7625aea-4beb-4a38-907f-9a757858ca0e-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-f2fl9\" (UID: \"c7625aea-4beb-4a38-907f-9a757858ca0e\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9"
Apr 25 00:49:05.378157 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.378018 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7625aea-4beb-4a38-907f-9a757858ca0e-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-f2fl9\" (UID: \"c7625aea-4beb-4a38-907f-9a757858ca0e\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9"
Apr 25 00:49:05.378157 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:49:05.378020 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-serving-cert: secret "xgboost-v2-mlserver-predictor-serving-cert" not found
Apr 25 00:49:05.378157 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:49:05.378109 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7625aea-4beb-4a38-907f-9a757858ca0e-proxy-tls podName:c7625aea-4beb-4a38-907f-9a757858ca0e nodeName:}" failed. No retries permitted until 2026-04-25 00:49:05.878085528 +0000 UTC m=+3306.159202440 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c7625aea-4beb-4a38-907f-9a757858ca0e-proxy-tls") pod "xgboost-v2-mlserver-predictor-7799869d6f-f2fl9" (UID: "c7625aea-4beb-4a38-907f-9a757858ca0e") : secret "xgboost-v2-mlserver-predictor-serving-cert" not found
Apr 25 00:49:05.378416 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.378397 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7625aea-4beb-4a38-907f-9a757858ca0e-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-f2fl9\" (UID: \"c7625aea-4beb-4a38-907f-9a757858ca0e\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9"
Apr 25 00:49:05.378695 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.378673 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7625aea-4beb-4a38-907f-9a757858ca0e-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-f2fl9\" (UID: \"c7625aea-4beb-4a38-907f-9a757858ca0e\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9"
Apr 25 00:49:05.386906 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.386874 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnr8d\" (UniqueName: \"kubernetes.io/projected/c7625aea-4beb-4a38-907f-9a757858ca0e-kube-api-access-jnr8d\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-f2fl9\" (UID: \"c7625aea-4beb-4a38-907f-9a757858ca0e\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9"
Apr 25 00:49:05.881887 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.881840 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7625aea-4beb-4a38-907f-9a757858ca0e-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-f2fl9\" (UID: \"c7625aea-4beb-4a38-907f-9a757858ca0e\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9"
Apr 25 00:49:05.884503 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:05.884478 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7625aea-4beb-4a38-907f-9a757858ca0e-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-f2fl9\" (UID: \"c7625aea-4beb-4a38-907f-9a757858ca0e\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9"
Apr 25 00:49:06.075607 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:06.075562 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9"
Apr 25 00:49:06.109006 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:06.108975 2576 generic.go:358] "Generic (PLEG): container finished" podID="c8491019-8b5c-4128-acf7-92e64fd24250" containerID="475a7b566f34c3fc29ceee907725f2b95c14a55a60e200795ecb144db6c48137" exitCode=2
Apr 25 00:49:06.109383 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:06.109019 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" event={"ID":"c8491019-8b5c-4128-acf7-92e64fd24250","Type":"ContainerDied","Data":"475a7b566f34c3fc29ceee907725f2b95c14a55a60e200795ecb144db6c48137"}
Apr 25 00:49:06.199837 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:06.199804 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9"]
Apr 25 00:49:07.113289 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:07.113251 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9" event={"ID":"c7625aea-4beb-4a38-907f-9a757858ca0e","Type":"ContainerStarted","Data":"7c27ff0d4190d9090825826fd71e42391678c51742cacd958404fa4450574f21"}
Apr 25 00:49:07.113289 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:07.113288 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9" event={"ID":"c7625aea-4beb-4a38-907f-9a757858ca0e","Type":"ContainerStarted","Data":"b3ead73223ba96db415c369e0faefc2b6528caf3cdf428c6b83218a07717b021"}
Apr 25 00:49:07.983477 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:07.983431 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" podUID="c8491019-8b5c-4128-acf7-92e64fd24250" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.57:8643/healthz\": dial tcp 10.134.0.57:8643: connect: connection refused"
Apr 25 00:49:07.989907 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:07.989878 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" podUID="c8491019-8b5c-4128-acf7-92e64fd24250" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.57:8080/v2/models/isvc-xgboost-v2-mlserver/ready\": dial tcp 10.134.0.57:8080: connect: connection refused"
Apr 25 00:49:11.126717 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:11.126684 2576 generic.go:358] "Generic (PLEG): container finished" podID="c8491019-8b5c-4128-acf7-92e64fd24250" containerID="1a321231077007fa3b0cd768db5f175c964f2f76eeb1481bd23014cb6a4cc20b" exitCode=0
Apr 25 00:49:11.127106 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:11.126762 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" event={"ID":"c8491019-8b5c-4128-acf7-92e64fd24250","Type":"ContainerDied","Data":"1a321231077007fa3b0cd768db5f175c964f2f76eeb1481bd23014cb6a4cc20b"}
Apr 25 00:49:11.127106 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:11.126802 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr" event={"ID":"c8491019-8b5c-4128-acf7-92e64fd24250","Type":"ContainerDied","Data":"9de62ac68bd3a83e4ab8526f0acefd69bf3fb06a7f31fbe4a772d3f4c296a34e"}
Apr 25 00:49:11.127106 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:11.126813 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9de62ac68bd3a83e4ab8526f0acefd69bf3fb06a7f31fbe4a772d3f4c296a34e"
Apr 25 00:49:11.127662 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:11.127642 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr"
Apr 25 00:49:11.127996 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:11.127978 2576 generic.go:358] "Generic (PLEG): container finished" podID="c7625aea-4beb-4a38-907f-9a757858ca0e" containerID="7c27ff0d4190d9090825826fd71e42391678c51742cacd958404fa4450574f21" exitCode=0
Apr 25 00:49:11.128080 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:11.128044 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9" event={"ID":"c7625aea-4beb-4a38-907f-9a757858ca0e","Type":"ContainerDied","Data":"7c27ff0d4190d9090825826fd71e42391678c51742cacd958404fa4450574f21"}
Apr 25 00:49:11.224600 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:11.224574 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8491019-8b5c-4128-acf7-92e64fd24250-kserve-provision-location\") pod \"c8491019-8b5c-4128-acf7-92e64fd24250\" (UID: \"c8491019-8b5c-4128-acf7-92e64fd24250\") "
Apr 25 00:49:11.224712 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:11.224612 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpw5f\" (UniqueName: \"kubernetes.io/projected/c8491019-8b5c-4128-acf7-92e64fd24250-kube-api-access-fpw5f\") pod \"c8491019-8b5c-4128-acf7-92e64fd24250\" (UID: \"c8491019-8b5c-4128-acf7-92e64fd24250\") "
Apr 25 00:49:11.224712 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:11.224656 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c8491019-8b5c-4128-acf7-92e64fd24250-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"c8491019-8b5c-4128-acf7-92e64fd24250\" (UID: \"c8491019-8b5c-4128-acf7-92e64fd24250\") "
Apr 25 00:49:11.224834 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:11.224733 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8491019-8b5c-4128-acf7-92e64fd24250-proxy-tls\") pod \"c8491019-8b5c-4128-acf7-92e64fd24250\" (UID: \"c8491019-8b5c-4128-acf7-92e64fd24250\") "
Apr 25 00:49:11.224987 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:11.224961 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8491019-8b5c-4128-acf7-92e64fd24250-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c8491019-8b5c-4128-acf7-92e64fd24250" (UID: "c8491019-8b5c-4128-acf7-92e64fd24250"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:49:11.225117 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:11.225014 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8491019-8b5c-4128-acf7-92e64fd24250-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "c8491019-8b5c-4128-acf7-92e64fd24250" (UID: "c8491019-8b5c-4128-acf7-92e64fd24250"). InnerVolumeSpecName "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:49:11.225117 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:11.225032 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8491019-8b5c-4128-acf7-92e64fd24250-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:49:11.226688 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:11.226669 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8491019-8b5c-4128-acf7-92e64fd24250-kube-api-access-fpw5f" (OuterVolumeSpecName: "kube-api-access-fpw5f") pod "c8491019-8b5c-4128-acf7-92e64fd24250" (UID: "c8491019-8b5c-4128-acf7-92e64fd24250"). InnerVolumeSpecName "kube-api-access-fpw5f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:49:11.226825 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:11.226801 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8491019-8b5c-4128-acf7-92e64fd24250-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c8491019-8b5c-4128-acf7-92e64fd24250" (UID: "c8491019-8b5c-4128-acf7-92e64fd24250"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:49:11.325673 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:11.325642 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8491019-8b5c-4128-acf7-92e64fd24250-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:49:11.325673 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:11.325666 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fpw5f\" (UniqueName: \"kubernetes.io/projected/c8491019-8b5c-4128-acf7-92e64fd24250-kube-api-access-fpw5f\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:49:11.325673 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:11.325676 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c8491019-8b5c-4128-acf7-92e64fd24250-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:49:12.132946 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:12.132894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9" event={"ID":"c7625aea-4beb-4a38-907f-9a757858ca0e","Type":"ContainerStarted","Data":"31d7514b0b8056283ac6238232e27264c1e39daae46e44c18f3f74c3eb580e69"}
Apr 25 00:49:12.132946 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:12.132939 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr"
Apr 25 00:49:12.132946 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:12.132953 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9" event={"ID":"c7625aea-4beb-4a38-907f-9a757858ca0e","Type":"ContainerStarted","Data":"cefc2d0728742ee7624afb0f88c602bb8c9c6233c51e8504918feac1201d54fb"}
Apr 25 00:49:12.133470 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:12.133334 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9"
Apr 25 00:49:12.133470 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:12.133355 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9"
Apr 25 00:49:12.154653 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:12.154615 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9" podStartSLOduration=7.154605297 podStartE2EDuration="7.154605297s" podCreationTimestamp="2026-04-25 00:49:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:49:12.15335216 +0000 UTC m=+3312.434469080" watchObservedRunningTime="2026-04-25 00:49:12.154605297 +0000 UTC m=+3312.435722217"
Apr 25 00:49:12.164945 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:12.164903 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr"]
Apr 25 00:49:12.168633 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:12.168611 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-gcbrr"]
Apr 25 00:49:12.315927 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:12.315880 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8491019-8b5c-4128-acf7-92e64fd24250" path="/var/lib/kubelet/pods/c8491019-8b5c-4128-acf7-92e64fd24250/volumes"
Apr 25 00:49:18.141436 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:18.141406 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9"
Apr 25 00:49:37.171546 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:37.171511 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log"
Apr 25 00:49:37.178151 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:37.178124 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log"
Apr 25 00:49:48.145473 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:48.145442 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9"
Apr 25 00:49:55.236243 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.236207 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9"]
Apr 25 00:49:55.236647 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.236618 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9" podUID="c7625aea-4beb-4a38-907f-9a757858ca0e" containerName="kserve-container" containerID="cri-o://cefc2d0728742ee7624afb0f88c602bb8c9c6233c51e8504918feac1201d54fb" gracePeriod=30
Apr 25 00:49:55.236779 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.236718 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9" podUID="c7625aea-4beb-4a38-907f-9a757858ca0e" containerName="kube-rbac-proxy" containerID="cri-o://31d7514b0b8056283ac6238232e27264c1e39daae46e44c18f3f74c3eb580e69" gracePeriod=30
Apr 25 00:49:55.327702 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.327668 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"]
Apr 25 00:49:55.328076 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.328059 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8491019-8b5c-4128-acf7-92e64fd24250" containerName="storage-initializer"
Apr 25 00:49:55.328184 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.328078 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8491019-8b5c-4128-acf7-92e64fd24250" containerName="storage-initializer"
Apr 25 00:49:55.328184 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.328094 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8491019-8b5c-4128-acf7-92e64fd24250" containerName="kube-rbac-proxy"
Apr 25 00:49:55.328184 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.328103 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8491019-8b5c-4128-acf7-92e64fd24250" containerName="kube-rbac-proxy"
Apr 25 00:49:55.328184 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.328125 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8491019-8b5c-4128-acf7-92e64fd24250" containerName="kserve-container"
Apr 25 00:49:55.328184 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.328134 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8491019-8b5c-4128-acf7-92e64fd24250" containerName="kserve-container"
Apr 25 00:49:55.328441 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.328196 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c8491019-8b5c-4128-acf7-92e64fd24250" containerName="kserve-container"
Apr 25 00:49:55.328441 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.328208 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c8491019-8b5c-4128-acf7-92e64fd24250" containerName="kube-rbac-proxy"
Apr 25 00:49:55.331697 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.331676 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"
Apr 25 00:49:55.333798 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.333769 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\""
Apr 25 00:49:55.333798 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.333783 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-predictor-serving-cert\""
Apr 25 00:49:55.339213 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.339179 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"]
Apr 25 00:49:55.343204 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.343182 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-pq4h5\" (UID: \"1acaf6dd-04d5-4297-8f72-a78ee7cd0490\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"
Apr 25 00:49:55.343334 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.343229 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-pq4h5\" (UID: \"1acaf6dd-04d5-4297-8f72-a78ee7cd0490\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"
Apr 25 00:49:55.343334 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.343300 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-pq4h5\" (UID: \"1acaf6dd-04d5-4297-8f72-a78ee7cd0490\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"
Apr 25 00:49:55.343334 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.343331 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dknbk\" (UniqueName: \"kubernetes.io/projected/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-kube-api-access-dknbk\") pod \"isvc-xgboost-runtime-predictor-779db84d9-pq4h5\" (UID: \"1acaf6dd-04d5-4297-8f72-a78ee7cd0490\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"
Apr 25 00:49:55.443998 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.443963 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-pq4h5\" (UID: \"1acaf6dd-04d5-4297-8f72-a78ee7cd0490\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"
Apr 25 00:49:55.444188 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.444021 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-pq4h5\" (UID: \"1acaf6dd-04d5-4297-8f72-a78ee7cd0490\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"
Apr 25 00:49:55.444188 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.444048 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dknbk\" (UniqueName: \"kubernetes.io/projected/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-kube-api-access-dknbk\") pod \"isvc-xgboost-runtime-predictor-779db84d9-pq4h5\" (UID: \"1acaf6dd-04d5-4297-8f72-a78ee7cd0490\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"
Apr 25 00:49:55.444188 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.444075 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-pq4h5\" (UID: \"1acaf6dd-04d5-4297-8f72-a78ee7cd0490\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"
Apr 25 00:49:55.444336 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:49:55.444204 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-serving-cert: secret "isvc-xgboost-runtime-predictor-serving-cert" not found
Apr 25 00:49:55.444336 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:49:55.444270 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-proxy-tls podName:1acaf6dd-04d5-4297-8f72-a78ee7cd0490 nodeName:}" failed. No retries permitted until 2026-04-25 00:49:55.944249174 +0000 UTC m=+3356.225366073 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-proxy-tls") pod "isvc-xgboost-runtime-predictor-779db84d9-pq4h5" (UID: "1acaf6dd-04d5-4297-8f72-a78ee7cd0490") : secret "isvc-xgboost-runtime-predictor-serving-cert" not found
Apr 25 00:49:55.444496 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.444472 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-pq4h5\" (UID: \"1acaf6dd-04d5-4297-8f72-a78ee7cd0490\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"
Apr 25 00:49:55.444670 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.444648 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-pq4h5\" (UID: \"1acaf6dd-04d5-4297-8f72-a78ee7cd0490\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"
Apr 25 00:49:55.452790 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.452771 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dknbk\" (UniqueName: \"kubernetes.io/projected/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-kube-api-access-dknbk\") pod \"isvc-xgboost-runtime-predictor-779db84d9-pq4h5\" (UID: \"1acaf6dd-04d5-4297-8f72-a78ee7cd0490\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"
Apr 25 00:49:55.948624 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.948591 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-pq4h5\" (UID: \"1acaf6dd-04d5-4297-8f72-a78ee7cd0490\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"
Apr 25 00:49:55.951035 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:55.951013 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-pq4h5\" (UID: \"1acaf6dd-04d5-4297-8f72-a78ee7cd0490\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"
Apr 25 00:49:56.242794 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:56.242715 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"
Apr 25 00:49:56.264728 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:56.264698 2576 generic.go:358] "Generic (PLEG): container finished" podID="c7625aea-4beb-4a38-907f-9a757858ca0e" containerID="31d7514b0b8056283ac6238232e27264c1e39daae46e44c18f3f74c3eb580e69" exitCode=2
Apr 25 00:49:56.264853 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:56.264776 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9" event={"ID":"c7625aea-4beb-4a38-907f-9a757858ca0e","Type":"ContainerDied","Data":"31d7514b0b8056283ac6238232e27264c1e39daae46e44c18f3f74c3eb580e69"}
Apr 25 00:49:56.363816 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:56.363779 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"]
Apr 25 00:49:56.367199 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:49:56.367173 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1acaf6dd_04d5_4297_8f72_a78ee7cd0490.slice/crio-37a77263728d77b9443d74d0db5804c62ced36253243a24c0870cf1fbff60587 WatchSource:0}: Error finding container 37a77263728d77b9443d74d0db5804c62ced36253243a24c0870cf1fbff60587: Status 404 returned error can't find the container with id 37a77263728d77b9443d74d0db5804c62ced36253243a24c0870cf1fbff60587
Apr 25 00:49:57.268965 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:57.268908 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5" event={"ID":"1acaf6dd-04d5-4297-8f72-a78ee7cd0490","Type":"ContainerStarted","Data":"c0c2f05ae67742d57d18d73f196a81c5b42af9163fd2d9cb7573fddb2b96cc6f"}
Apr 25 00:49:57.268965 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:57.268967 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5" event={"ID":"1acaf6dd-04d5-4297-8f72-a78ee7cd0490","Type":"ContainerStarted","Data":"37a77263728d77b9443d74d0db5804c62ced36253243a24c0870cf1fbff60587"}
Apr 25 00:49:58.137042 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:58.137000 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9" podUID="c7625aea-4beb-4a38-907f-9a757858ca0e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.58:8643/healthz\": dial tcp 10.134.0.58:8643: connect: connection refused"
Apr 25 00:49:58.142869 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:49:58.142847 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9" podUID="c7625aea-4beb-4a38-907f-9a757858ca0e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.58:8080/v2/models/xgboost-v2-mlserver/ready\": dial tcp 10.134.0.58:8080: connect: connection refused"
Apr 25 00:50:00.279220 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:00.279184 2576 generic.go:358] "Generic (PLEG): container finished" podID="1acaf6dd-04d5-4297-8f72-a78ee7cd0490" containerID="c0c2f05ae67742d57d18d73f196a81c5b42af9163fd2d9cb7573fddb2b96cc6f" exitCode=0
Apr 25 00:50:00.279627 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:00.279252 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5" event={"ID":"1acaf6dd-04d5-4297-8f72-a78ee7cd0490","Type":"ContainerDied","Data":"c0c2f05ae67742d57d18d73f196a81c5b42af9163fd2d9cb7573fddb2b96cc6f"}
Apr 25 00:50:01.284757 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:01.284719 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5" event={"ID":"1acaf6dd-04d5-4297-8f72-a78ee7cd0490","Type":"ContainerStarted","Data":"51a063090421f5f96ce6e83810b58744f6cfd4d9c27ecc814160911dcf8ca0bf"}
Apr 25 00:50:01.284757 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:01.284763 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5" event={"ID":"1acaf6dd-04d5-4297-8f72-a78ee7cd0490","Type":"ContainerStarted","Data":"51e03b9aeeee0ce6cd97a1cefd7f78b6ab9601b7a417f2e8b980f194fd2541ae"}
Apr 25 00:50:01.285265 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:01.285063 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"
Apr 25 00:50:01.285265 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:01.285201 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"
Apr 25 00:50:01.286464 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:01.286428 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5" podUID="1acaf6dd-04d5-4297-8f72-a78ee7cd0490" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused"
Apr 25 00:50:01.303723 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:01.303685 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5" podStartSLOduration=6.303673575 podStartE2EDuration="6.303673575s" podCreationTimestamp="2026-04-25 00:49:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:50:01.301716413 +0000 UTC m=+3361.582833355" watchObservedRunningTime="2026-04-25 00:50:01.303673575 +0000 UTC m=+3361.584790494"
Apr 25 00:50:01.607128 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:01.607106 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9"
Apr 25 00:50:01.685809 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:01.685780 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7625aea-4beb-4a38-907f-9a757858ca0e-proxy-tls\") pod \"c7625aea-4beb-4a38-907f-9a757858ca0e\" (UID: \"c7625aea-4beb-4a38-907f-9a757858ca0e\") "
Apr 25 00:50:01.685998 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:01.685818 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7625aea-4beb-4a38-907f-9a757858ca0e-kserve-provision-location\") pod \"c7625aea-4beb-4a38-907f-9a757858ca0e\" (UID: \"c7625aea-4beb-4a38-907f-9a757858ca0e\") "
Apr 25 00:50:01.685998 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:01.685848 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7625aea-4beb-4a38-907f-9a757858ca0e-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"c7625aea-4beb-4a38-907f-9a757858ca0e\" (UID: \"c7625aea-4beb-4a38-907f-9a757858ca0e\") "
Apr 25 00:50:01.686105 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:01.685991 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnr8d\" (UniqueName: \"kubernetes.io/projected/c7625aea-4beb-4a38-907f-9a757858ca0e-kube-api-access-jnr8d\") pod \"c7625aea-4beb-4a38-907f-9a757858ca0e\" (UID: \"c7625aea-4beb-4a38-907f-9a757858ca0e\") "
Apr 25 00:50:01.686233 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:01.686208 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7625aea-4beb-4a38-907f-9a757858ca0e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c7625aea-4beb-4a38-907f-9a757858ca0e" (UID: "c7625aea-4beb-4a38-907f-9a757858ca0e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:50:01.686233 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:01.686223 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7625aea-4beb-4a38-907f-9a757858ca0e-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "c7625aea-4beb-4a38-907f-9a757858ca0e" (UID: "c7625aea-4beb-4a38-907f-9a757858ca0e"). InnerVolumeSpecName "xgboost-v2-mlserver-kube-rbac-proxy-sar-config".
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:50:01.688122 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:01.688103 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7625aea-4beb-4a38-907f-9a757858ca0e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c7625aea-4beb-4a38-907f-9a757858ca0e" (UID: "c7625aea-4beb-4a38-907f-9a757858ca0e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:50:01.688233 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:01.688212 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7625aea-4beb-4a38-907f-9a757858ca0e-kube-api-access-jnr8d" (OuterVolumeSpecName: "kube-api-access-jnr8d") pod "c7625aea-4beb-4a38-907f-9a757858ca0e" (UID: "c7625aea-4beb-4a38-907f-9a757858ca0e"). InnerVolumeSpecName "kube-api-access-jnr8d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:50:01.786792 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:01.786764 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7625aea-4beb-4a38-907f-9a757858ca0e-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:50:01.786792 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:01.786789 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7625aea-4beb-4a38-907f-9a757858ca0e-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:50:01.786969 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:01.786800 2576 reconciler_common.go:299] "Volume detached for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7625aea-4beb-4a38-907f-9a757858ca0e-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 
00:50:01.786969 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:01.786811 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jnr8d\" (UniqueName: \"kubernetes.io/projected/c7625aea-4beb-4a38-907f-9a757858ca0e-kube-api-access-jnr8d\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:50:02.288810 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:02.288772 2576 generic.go:358] "Generic (PLEG): container finished" podID="c7625aea-4beb-4a38-907f-9a757858ca0e" containerID="cefc2d0728742ee7624afb0f88c602bb8c9c6233c51e8504918feac1201d54fb" exitCode=0 Apr 25 00:50:02.289324 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:02.288862 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9" Apr 25 00:50:02.289324 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:02.288896 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9" event={"ID":"c7625aea-4beb-4a38-907f-9a757858ca0e","Type":"ContainerDied","Data":"cefc2d0728742ee7624afb0f88c602bb8c9c6233c51e8504918feac1201d54fb"} Apr 25 00:50:02.289324 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:02.288961 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9" event={"ID":"c7625aea-4beb-4a38-907f-9a757858ca0e","Type":"ContainerDied","Data":"b3ead73223ba96db415c369e0faefc2b6528caf3cdf428c6b83218a07717b021"} Apr 25 00:50:02.289324 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:02.288981 2576 scope.go:117] "RemoveContainer" containerID="31d7514b0b8056283ac6238232e27264c1e39daae46e44c18f3f74c3eb580e69" Apr 25 00:50:02.289708 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:02.289671 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5" 
podUID="1acaf6dd-04d5-4297-8f72-a78ee7cd0490" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 25 00:50:02.297843 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:02.297827 2576 scope.go:117] "RemoveContainer" containerID="cefc2d0728742ee7624afb0f88c602bb8c9c6233c51e8504918feac1201d54fb" Apr 25 00:50:02.305044 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:02.305029 2576 scope.go:117] "RemoveContainer" containerID="7c27ff0d4190d9090825826fd71e42391678c51742cacd958404fa4450574f21" Apr 25 00:50:02.309821 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:02.309799 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9"] Apr 25 00:50:02.312345 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:02.312086 2576 scope.go:117] "RemoveContainer" containerID="31d7514b0b8056283ac6238232e27264c1e39daae46e44c18f3f74c3eb580e69" Apr 25 00:50:02.312468 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:50:02.312428 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31d7514b0b8056283ac6238232e27264c1e39daae46e44c18f3f74c3eb580e69\": container with ID starting with 31d7514b0b8056283ac6238232e27264c1e39daae46e44c18f3f74c3eb580e69 not found: ID does not exist" containerID="31d7514b0b8056283ac6238232e27264c1e39daae46e44c18f3f74c3eb580e69" Apr 25 00:50:02.312536 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:02.312461 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31d7514b0b8056283ac6238232e27264c1e39daae46e44c18f3f74c3eb580e69"} err="failed to get container status \"31d7514b0b8056283ac6238232e27264c1e39daae46e44c18f3f74c3eb580e69\": rpc error: code = NotFound desc = could not find container \"31d7514b0b8056283ac6238232e27264c1e39daae46e44c18f3f74c3eb580e69\": container with ID starting with 
31d7514b0b8056283ac6238232e27264c1e39daae46e44c18f3f74c3eb580e69 not found: ID does not exist" Apr 25 00:50:02.312536 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:02.312483 2576 scope.go:117] "RemoveContainer" containerID="cefc2d0728742ee7624afb0f88c602bb8c9c6233c51e8504918feac1201d54fb" Apr 25 00:50:02.312772 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:50:02.312752 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cefc2d0728742ee7624afb0f88c602bb8c9c6233c51e8504918feac1201d54fb\": container with ID starting with cefc2d0728742ee7624afb0f88c602bb8c9c6233c51e8504918feac1201d54fb not found: ID does not exist" containerID="cefc2d0728742ee7624afb0f88c602bb8c9c6233c51e8504918feac1201d54fb" Apr 25 00:50:02.312839 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:02.312789 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cefc2d0728742ee7624afb0f88c602bb8c9c6233c51e8504918feac1201d54fb"} err="failed to get container status \"cefc2d0728742ee7624afb0f88c602bb8c9c6233c51e8504918feac1201d54fb\": rpc error: code = NotFound desc = could not find container \"cefc2d0728742ee7624afb0f88c602bb8c9c6233c51e8504918feac1201d54fb\": container with ID starting with cefc2d0728742ee7624afb0f88c602bb8c9c6233c51e8504918feac1201d54fb not found: ID does not exist" Apr 25 00:50:02.312839 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:02.312806 2576 scope.go:117] "RemoveContainer" containerID="7c27ff0d4190d9090825826fd71e42391678c51742cacd958404fa4450574f21" Apr 25 00:50:02.313470 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:50:02.313447 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c27ff0d4190d9090825826fd71e42391678c51742cacd958404fa4450574f21\": container with ID starting with 7c27ff0d4190d9090825826fd71e42391678c51742cacd958404fa4450574f21 not found: ID does not exist" 
containerID="7c27ff0d4190d9090825826fd71e42391678c51742cacd958404fa4450574f21" Apr 25 00:50:02.313551 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:02.313475 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c27ff0d4190d9090825826fd71e42391678c51742cacd958404fa4450574f21"} err="failed to get container status \"7c27ff0d4190d9090825826fd71e42391678c51742cacd958404fa4450574f21\": rpc error: code = NotFound desc = could not find container \"7c27ff0d4190d9090825826fd71e42391678c51742cacd958404fa4450574f21\": container with ID starting with 7c27ff0d4190d9090825826fd71e42391678c51742cacd958404fa4450574f21 not found: ID does not exist" Apr 25 00:50:02.316111 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:02.316094 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-f2fl9"] Apr 25 00:50:04.315231 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:04.315201 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7625aea-4beb-4a38-907f-9a757858ca0e" path="/var/lib/kubelet/pods/c7625aea-4beb-4a38-907f-9a757858ca0e/volumes" Apr 25 00:50:07.293389 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:07.293360 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5" Apr 25 00:50:07.293853 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:07.293829 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5" podUID="1acaf6dd-04d5-4297-8f72-a78ee7cd0490" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 25 00:50:17.293805 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:17.293767 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5" 
podUID="1acaf6dd-04d5-4297-8f72-a78ee7cd0490" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 25 00:50:27.294500 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:27.294460 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5" podUID="1acaf6dd-04d5-4297-8f72-a78ee7cd0490" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 25 00:50:37.293885 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:37.293838 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5" podUID="1acaf6dd-04d5-4297-8f72-a78ee7cd0490" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 25 00:50:47.294762 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:47.294718 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5" podUID="1acaf6dd-04d5-4297-8f72-a78ee7cd0490" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 25 00:50:57.295090 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:50:57.295053 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5" Apr 25 00:51:02.235314 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:02.235281 2576 scope.go:117] "RemoveContainer" containerID="ca63627b1755b2b72fc440a43c143c0019bec302b87135dd556aed80413e881c" Apr 25 00:51:05.401962 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.401930 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"] Apr 25 00:51:05.402431 ip-10-0-129-109 kubenswrapper[2576]: I0425 
00:51:05.402235 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5" podUID="1acaf6dd-04d5-4297-8f72-a78ee7cd0490" containerName="kserve-container" containerID="cri-o://51e03b9aeeee0ce6cd97a1cefd7f78b6ab9601b7a417f2e8b980f194fd2541ae" gracePeriod=30 Apr 25 00:51:05.402573 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.402294 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5" podUID="1acaf6dd-04d5-4297-8f72-a78ee7cd0490" containerName="kube-rbac-proxy" containerID="cri-o://51a063090421f5f96ce6e83810b58744f6cfd4d9c27ecc814160911dcf8ca0bf" gracePeriod=30 Apr 25 00:51:05.466746 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.466705 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g"] Apr 25 00:51:05.467013 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.466999 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7625aea-4beb-4a38-907f-9a757858ca0e" containerName="storage-initializer" Apr 25 00:51:05.467013 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.467013 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7625aea-4beb-4a38-907f-9a757858ca0e" containerName="storage-initializer" Apr 25 00:51:05.467134 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.467029 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7625aea-4beb-4a38-907f-9a757858ca0e" containerName="kube-rbac-proxy" Apr 25 00:51:05.467134 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.467035 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7625aea-4beb-4a38-907f-9a757858ca0e" containerName="kube-rbac-proxy" Apr 25 00:51:05.467134 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.467045 2576 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="c7625aea-4beb-4a38-907f-9a757858ca0e" containerName="kserve-container" Apr 25 00:51:05.467134 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.467050 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7625aea-4beb-4a38-907f-9a757858ca0e" containerName="kserve-container" Apr 25 00:51:05.467134 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.467091 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7625aea-4beb-4a38-907f-9a757858ca0e" containerName="kserve-container" Apr 25 00:51:05.467134 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.467100 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7625aea-4beb-4a38-907f-9a757858ca0e" containerName="kube-rbac-proxy" Apr 25 00:51:05.471310 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.471293 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" Apr 25 00:51:05.473826 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.473798 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-predictor-serving-cert\"" Apr 25 00:51:05.473826 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.473814 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 25 00:51:05.478882 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.478857 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g"] Apr 25 00:51:05.538397 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.538365 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/26f7a865-9010-47e4-85b4-d723fe4a974d-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g\" (UID: \"26f7a865-9010-47e4-85b4-d723fe4a974d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" Apr 25 00:51:05.538544 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.538442 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26f7a865-9010-47e4-85b4-d723fe4a974d-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g\" (UID: \"26f7a865-9010-47e4-85b4-d723fe4a974d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" Apr 25 00:51:05.538544 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.538468 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26f7a865-9010-47e4-85b4-d723fe4a974d-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g\" (UID: \"26f7a865-9010-47e4-85b4-d723fe4a974d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" Apr 25 00:51:05.538544 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.538515 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7krjj\" (UniqueName: \"kubernetes.io/projected/26f7a865-9010-47e4-85b4-d723fe4a974d-kube-api-access-7krjj\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g\" (UID: \"26f7a865-9010-47e4-85b4-d723fe4a974d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" Apr 25 00:51:05.639417 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.639384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7krjj\" (UniqueName: 
\"kubernetes.io/projected/26f7a865-9010-47e4-85b4-d723fe4a974d-kube-api-access-7krjj\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g\" (UID: \"26f7a865-9010-47e4-85b4-d723fe4a974d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" Apr 25 00:51:05.639600 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.639435 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26f7a865-9010-47e4-85b4-d723fe4a974d-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g\" (UID: \"26f7a865-9010-47e4-85b4-d723fe4a974d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" Apr 25 00:51:05.639600 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.639491 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26f7a865-9010-47e4-85b4-d723fe4a974d-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g\" (UID: \"26f7a865-9010-47e4-85b4-d723fe4a974d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" Apr 25 00:51:05.639600 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.639517 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26f7a865-9010-47e4-85b4-d723fe4a974d-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g\" (UID: \"26f7a865-9010-47e4-85b4-d723fe4a974d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" Apr 25 00:51:05.639905 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.639879 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/26f7a865-9010-47e4-85b4-d723fe4a974d-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g\" (UID: \"26f7a865-9010-47e4-85b4-d723fe4a974d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" Apr 25 00:51:05.640210 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.640189 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26f7a865-9010-47e4-85b4-d723fe4a974d-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g\" (UID: \"26f7a865-9010-47e4-85b4-d723fe4a974d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" Apr 25 00:51:05.642058 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.642041 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26f7a865-9010-47e4-85b4-d723fe4a974d-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g\" (UID: \"26f7a865-9010-47e4-85b4-d723fe4a974d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" Apr 25 00:51:05.647286 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.647250 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7krjj\" (UniqueName: \"kubernetes.io/projected/26f7a865-9010-47e4-85b4-d723fe4a974d-kube-api-access-7krjj\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g\" (UID: \"26f7a865-9010-47e4-85b4-d723fe4a974d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" Apr 25 00:51:05.782510 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.782487 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" Apr 25 00:51:05.902666 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:05.902613 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g"] Apr 25 00:51:05.904817 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:51:05.904789 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26f7a865_9010_47e4_85b4_d723fe4a974d.slice/crio-6cf0347adc5cbc2e091030b527f336130fe642e18c462d92072107288486d466 WatchSource:0}: Error finding container 6cf0347adc5cbc2e091030b527f336130fe642e18c462d92072107288486d466: Status 404 returned error can't find the container with id 6cf0347adc5cbc2e091030b527f336130fe642e18c462d92072107288486d466 Apr 25 00:51:06.472174 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:06.472135 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" event={"ID":"26f7a865-9010-47e4-85b4-d723fe4a974d","Type":"ContainerStarted","Data":"9050a997899ce6f95d46f6db8410b1a8569aa6e3ad485c06d8fb1b02fbd4d1f3"} Apr 25 00:51:06.472174 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:06.472177 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" event={"ID":"26f7a865-9010-47e4-85b4-d723fe4a974d","Type":"ContainerStarted","Data":"6cf0347adc5cbc2e091030b527f336130fe642e18c462d92072107288486d466"} Apr 25 00:51:06.474124 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:06.474087 2576 generic.go:358] "Generic (PLEG): container finished" podID="1acaf6dd-04d5-4297-8f72-a78ee7cd0490" containerID="51a063090421f5f96ce6e83810b58744f6cfd4d9c27ecc814160911dcf8ca0bf" exitCode=2 Apr 25 00:51:06.474235 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:06.474156 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5" event={"ID":"1acaf6dd-04d5-4297-8f72-a78ee7cd0490","Type":"ContainerDied","Data":"51a063090421f5f96ce6e83810b58744f6cfd4d9c27ecc814160911dcf8ca0bf"} Apr 25 00:51:07.290409 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:07.290367 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5" podUID="1acaf6dd-04d5-4297-8f72-a78ee7cd0490" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.59:8643/healthz\": dial tcp 10.134.0.59:8643: connect: connection refused" Apr 25 00:51:07.294653 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:07.294629 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5" podUID="1acaf6dd-04d5-4297-8f72-a78ee7cd0490" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 25 00:51:08.741949 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:08.741903 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5" Apr 25 00:51:08.862243 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:08.862161 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dknbk\" (UniqueName: \"kubernetes.io/projected/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-kube-api-access-dknbk\") pod \"1acaf6dd-04d5-4297-8f72-a78ee7cd0490\" (UID: \"1acaf6dd-04d5-4297-8f72-a78ee7cd0490\") " Apr 25 00:51:08.862243 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:08.862206 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-proxy-tls\") pod \"1acaf6dd-04d5-4297-8f72-a78ee7cd0490\" (UID: \"1acaf6dd-04d5-4297-8f72-a78ee7cd0490\") " Apr 25 00:51:08.862431 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:08.862252 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-kserve-provision-location\") pod \"1acaf6dd-04d5-4297-8f72-a78ee7cd0490\" (UID: \"1acaf6dd-04d5-4297-8f72-a78ee7cd0490\") " Apr 25 00:51:08.862431 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:08.862279 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"1acaf6dd-04d5-4297-8f72-a78ee7cd0490\" (UID: \"1acaf6dd-04d5-4297-8f72-a78ee7cd0490\") " Apr 25 00:51:08.862621 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:08.862591 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"1acaf6dd-04d5-4297-8f72-a78ee7cd0490" (UID: "1acaf6dd-04d5-4297-8f72-a78ee7cd0490"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:51:08.862742 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:08.862596 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-isvc-xgboost-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-runtime-kube-rbac-proxy-sar-config") pod "1acaf6dd-04d5-4297-8f72-a78ee7cd0490" (UID: "1acaf6dd-04d5-4297-8f72-a78ee7cd0490"). InnerVolumeSpecName "isvc-xgboost-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:51:08.864491 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:08.864469 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1acaf6dd-04d5-4297-8f72-a78ee7cd0490" (UID: "1acaf6dd-04d5-4297-8f72-a78ee7cd0490"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:51:08.864586 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:08.864488 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-kube-api-access-dknbk" (OuterVolumeSpecName: "kube-api-access-dknbk") pod "1acaf6dd-04d5-4297-8f72-a78ee7cd0490" (UID: "1acaf6dd-04d5-4297-8f72-a78ee7cd0490"). InnerVolumeSpecName "kube-api-access-dknbk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:51:08.963592 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:08.963556 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:51:08.963592 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:08.963588 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:51:08.963592 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:08.963599 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dknbk\" (UniqueName: \"kubernetes.io/projected/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-kube-api-access-dknbk\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:51:08.963828 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:08.963610 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1acaf6dd-04d5-4297-8f72-a78ee7cd0490-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:51:09.484172 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:09.484140 2576 generic.go:358] "Generic (PLEG): container finished" podID="1acaf6dd-04d5-4297-8f72-a78ee7cd0490" containerID="51e03b9aeeee0ce6cd97a1cefd7f78b6ab9601b7a417f2e8b980f194fd2541ae" exitCode=0
Apr 25 00:51:09.484337 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:09.484199 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5" event={"ID":"1acaf6dd-04d5-4297-8f72-a78ee7cd0490","Type":"ContainerDied","Data":"51e03b9aeeee0ce6cd97a1cefd7f78b6ab9601b7a417f2e8b980f194fd2541ae"}
Apr 25 00:51:09.484337 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:09.484220 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"
Apr 25 00:51:09.484337 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:09.484235 2576 scope.go:117] "RemoveContainer" containerID="51a063090421f5f96ce6e83810b58744f6cfd4d9c27ecc814160911dcf8ca0bf"
Apr 25 00:51:09.484458 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:09.484225 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5" event={"ID":"1acaf6dd-04d5-4297-8f72-a78ee7cd0490","Type":"ContainerDied","Data":"37a77263728d77b9443d74d0db5804c62ced36253243a24c0870cf1fbff60587"}
Apr 25 00:51:09.492797 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:09.492781 2576 scope.go:117] "RemoveContainer" containerID="51e03b9aeeee0ce6cd97a1cefd7f78b6ab9601b7a417f2e8b980f194fd2541ae"
Apr 25 00:51:09.500485 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:09.500461 2576 scope.go:117] "RemoveContainer" containerID="c0c2f05ae67742d57d18d73f196a81c5b42af9163fd2d9cb7573fddb2b96cc6f"
Apr 25 00:51:09.506514 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:09.506487 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"]
Apr 25 00:51:09.508671 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:09.508646 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-pq4h5"]
Apr 25 00:51:09.509768 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:09.509752 2576 scope.go:117] "RemoveContainer" containerID="51a063090421f5f96ce6e83810b58744f6cfd4d9c27ecc814160911dcf8ca0bf"
Apr 25 00:51:09.510062 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:51:09.510042 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51a063090421f5f96ce6e83810b58744f6cfd4d9c27ecc814160911dcf8ca0bf\": container with ID starting with 51a063090421f5f96ce6e83810b58744f6cfd4d9c27ecc814160911dcf8ca0bf not found: ID does not exist" containerID="51a063090421f5f96ce6e83810b58744f6cfd4d9c27ecc814160911dcf8ca0bf"
Apr 25 00:51:09.510161 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:09.510072 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51a063090421f5f96ce6e83810b58744f6cfd4d9c27ecc814160911dcf8ca0bf"} err="failed to get container status \"51a063090421f5f96ce6e83810b58744f6cfd4d9c27ecc814160911dcf8ca0bf\": rpc error: code = NotFound desc = could not find container \"51a063090421f5f96ce6e83810b58744f6cfd4d9c27ecc814160911dcf8ca0bf\": container with ID starting with 51a063090421f5f96ce6e83810b58744f6cfd4d9c27ecc814160911dcf8ca0bf not found: ID does not exist"
Apr 25 00:51:09.510161 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:09.510090 2576 scope.go:117] "RemoveContainer" containerID="51e03b9aeeee0ce6cd97a1cefd7f78b6ab9601b7a417f2e8b980f194fd2541ae"
Apr 25 00:51:09.510357 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:51:09.510338 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51e03b9aeeee0ce6cd97a1cefd7f78b6ab9601b7a417f2e8b980f194fd2541ae\": container with ID starting with 51e03b9aeeee0ce6cd97a1cefd7f78b6ab9601b7a417f2e8b980f194fd2541ae not found: ID does not exist" containerID="51e03b9aeeee0ce6cd97a1cefd7f78b6ab9601b7a417f2e8b980f194fd2541ae"
Apr 25 00:51:09.510397 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:09.510364 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51e03b9aeeee0ce6cd97a1cefd7f78b6ab9601b7a417f2e8b980f194fd2541ae"} err="failed to get container status \"51e03b9aeeee0ce6cd97a1cefd7f78b6ab9601b7a417f2e8b980f194fd2541ae\": rpc error: code = NotFound desc = could not find container \"51e03b9aeeee0ce6cd97a1cefd7f78b6ab9601b7a417f2e8b980f194fd2541ae\": container with ID starting with 51e03b9aeeee0ce6cd97a1cefd7f78b6ab9601b7a417f2e8b980f194fd2541ae not found: ID does not exist"
Apr 25 00:51:09.510397 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:09.510382 2576 scope.go:117] "RemoveContainer" containerID="c0c2f05ae67742d57d18d73f196a81c5b42af9163fd2d9cb7573fddb2b96cc6f"
Apr 25 00:51:09.510629 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:51:09.510607 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c2f05ae67742d57d18d73f196a81c5b42af9163fd2d9cb7573fddb2b96cc6f\": container with ID starting with c0c2f05ae67742d57d18d73f196a81c5b42af9163fd2d9cb7573fddb2b96cc6f not found: ID does not exist" containerID="c0c2f05ae67742d57d18d73f196a81c5b42af9163fd2d9cb7573fddb2b96cc6f"
Apr 25 00:51:09.510680 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:09.510635 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c2f05ae67742d57d18d73f196a81c5b42af9163fd2d9cb7573fddb2b96cc6f"} err="failed to get container status \"c0c2f05ae67742d57d18d73f196a81c5b42af9163fd2d9cb7573fddb2b96cc6f\": rpc error: code = NotFound desc = could not find container \"c0c2f05ae67742d57d18d73f196a81c5b42af9163fd2d9cb7573fddb2b96cc6f\": container with ID starting with c0c2f05ae67742d57d18d73f196a81c5b42af9163fd2d9cb7573fddb2b96cc6f not found: ID does not exist"
Apr 25 00:51:10.315726 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:10.315690 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1acaf6dd-04d5-4297-8f72-a78ee7cd0490" path="/var/lib/kubelet/pods/1acaf6dd-04d5-4297-8f72-a78ee7cd0490/volumes"
Apr 25 00:51:10.488003 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:10.487969 2576 generic.go:358] "Generic (PLEG): container finished" podID="26f7a865-9010-47e4-85b4-d723fe4a974d" containerID="9050a997899ce6f95d46f6db8410b1a8569aa6e3ad485c06d8fb1b02fbd4d1f3" exitCode=0
Apr 25 00:51:10.488186 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:10.488044 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" event={"ID":"26f7a865-9010-47e4-85b4-d723fe4a974d","Type":"ContainerDied","Data":"9050a997899ce6f95d46f6db8410b1a8569aa6e3ad485c06d8fb1b02fbd4d1f3"}
Apr 25 00:51:11.493137 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:11.493103 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" event={"ID":"26f7a865-9010-47e4-85b4-d723fe4a974d","Type":"ContainerStarted","Data":"940f1cd26342eba3f9648876fc60b044d550b0784d27f9800732e49778b3342b"}
Apr 25 00:51:11.493137 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:11.493142 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" event={"ID":"26f7a865-9010-47e4-85b4-d723fe4a974d","Type":"ContainerStarted","Data":"e458a631df64b4fd8e53b6992b39dbd69fc6d4becabd67b79242e7f9e7326776"}
Apr 25 00:51:11.493535 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:11.493361 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g"
Apr 25 00:51:11.493535 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:11.493412 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g"
Apr 25 00:51:11.513242 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:11.513201 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" podStartSLOduration=6.513186133 podStartE2EDuration="6.513186133s" podCreationTimestamp="2026-04-25 00:51:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:51:11.512062793 +0000 UTC m=+3431.793179713" watchObservedRunningTime="2026-04-25 00:51:11.513186133 +0000 UTC m=+3431.794303053"
Apr 25 00:51:17.501418 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:17.501394 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g"
Apr 25 00:51:47.579984 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:47.579945 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" podUID="26f7a865-9010-47e4-85b4-d723fe4a974d" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400"
Apr 25 00:51:57.504268 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:51:57.504239 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g"
Apr 25 00:52:05.556937 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.556885 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g"]
Apr 25 00:52:05.557409 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.557308 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" podUID="26f7a865-9010-47e4-85b4-d723fe4a974d" containerName="kserve-container" containerID="cri-o://e458a631df64b4fd8e53b6992b39dbd69fc6d4becabd67b79242e7f9e7326776" gracePeriod=30
Apr 25 00:52:05.557478 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.557397 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" podUID="26f7a865-9010-47e4-85b4-d723fe4a974d" containerName="kube-rbac-proxy" containerID="cri-o://940f1cd26342eba3f9648876fc60b044d550b0784d27f9800732e49778b3342b" gracePeriod=30
Apr 25 00:52:05.640710 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.640672 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"]
Apr 25 00:52:05.640988 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.640975 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1acaf6dd-04d5-4297-8f72-a78ee7cd0490" containerName="storage-initializer"
Apr 25 00:52:05.641041 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.640989 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1acaf6dd-04d5-4297-8f72-a78ee7cd0490" containerName="storage-initializer"
Apr 25 00:52:05.641041 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.640996 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1acaf6dd-04d5-4297-8f72-a78ee7cd0490" containerName="kube-rbac-proxy"
Apr 25 00:52:05.641041 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.641002 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1acaf6dd-04d5-4297-8f72-a78ee7cd0490" containerName="kube-rbac-proxy"
Apr 25 00:52:05.641041 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.641010 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1acaf6dd-04d5-4297-8f72-a78ee7cd0490" containerName="kserve-container"
Apr 25 00:52:05.641041 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.641016 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1acaf6dd-04d5-4297-8f72-a78ee7cd0490" containerName="kserve-container"
Apr 25 00:52:05.641200 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.641060 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1acaf6dd-04d5-4297-8f72-a78ee7cd0490" containerName="kserve-container"
Apr 25 00:52:05.641200 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.641067 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1acaf6dd-04d5-4297-8f72-a78ee7cd0490" containerName="kube-rbac-proxy"
Apr 25 00:52:05.645181 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.645160 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"
Apr 25 00:52:05.647380 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.647354 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-predictor-serving-cert\""
Apr 25 00:52:05.647538 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.647520 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-kube-rbac-proxy-sar-config\""
Apr 25 00:52:05.654124 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.654101 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"]
Apr 25 00:52:05.678417 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.678387 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk\" (UID: \"153bab5f-88b6-4b0e-bf7f-40ac12db9ced\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"
Apr 25 00:52:05.678544 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.678425 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk\" (UID: \"153bab5f-88b6-4b0e-bf7f-40ac12db9ced\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"
Apr 25 00:52:05.678544 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.678447 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7svp8\" (UniqueName: \"kubernetes.io/projected/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-kube-api-access-7svp8\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk\" (UID: \"153bab5f-88b6-4b0e-bf7f-40ac12db9ced\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"
Apr 25 00:52:05.678663 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.678532 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk\" (UID: \"153bab5f-88b6-4b0e-bf7f-40ac12db9ced\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"
Apr 25 00:52:05.779764 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.779722 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk\" (UID: \"153bab5f-88b6-4b0e-bf7f-40ac12db9ced\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"
Apr 25 00:52:05.779764 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.779758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk\" (UID: \"153bab5f-88b6-4b0e-bf7f-40ac12db9ced\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"
Apr 25 00:52:05.780074 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.779782 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7svp8\" (UniqueName: \"kubernetes.io/projected/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-kube-api-access-7svp8\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk\" (UID: \"153bab5f-88b6-4b0e-bf7f-40ac12db9ced\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"
Apr 25 00:52:05.780074 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.779831 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk\" (UID: \"153bab5f-88b6-4b0e-bf7f-40ac12db9ced\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"
Apr 25 00:52:05.780074 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:52:05.779876 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-v2-predictor-serving-cert: secret "isvc-xgboost-v2-predictor-serving-cert" not found
Apr 25 00:52:05.780074 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:52:05.779971 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-proxy-tls podName:153bab5f-88b6-4b0e-bf7f-40ac12db9ced nodeName:}" failed. No retries permitted until 2026-04-25 00:52:06.279951108 +0000 UTC m=+3486.561068009 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-proxy-tls") pod "isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk" (UID: "153bab5f-88b6-4b0e-bf7f-40ac12db9ced") : secret "isvc-xgboost-v2-predictor-serving-cert" not found
Apr 25 00:52:05.780340 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.780241 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk\" (UID: \"153bab5f-88b6-4b0e-bf7f-40ac12db9ced\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"
Apr 25 00:52:05.780472 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.780451 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk\" (UID: \"153bab5f-88b6-4b0e-bf7f-40ac12db9ced\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"
Apr 25 00:52:05.788215 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:05.788195 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7svp8\" (UniqueName: \"kubernetes.io/projected/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-kube-api-access-7svp8\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk\" (UID: \"153bab5f-88b6-4b0e-bf7f-40ac12db9ced\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"
Apr 25 00:52:06.283425 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:06.283392 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk\" (UID: \"153bab5f-88b6-4b0e-bf7f-40ac12db9ced\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"
Apr 25 00:52:06.285969 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:06.285943 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk\" (UID: \"153bab5f-88b6-4b0e-bf7f-40ac12db9ced\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"
Apr 25 00:52:06.555703 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:06.555608 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"
Apr 25 00:52:06.661837 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:06.661808 2576 generic.go:358] "Generic (PLEG): container finished" podID="26f7a865-9010-47e4-85b4-d723fe4a974d" containerID="940f1cd26342eba3f9648876fc60b044d550b0784d27f9800732e49778b3342b" exitCode=2
Apr 25 00:52:06.662207 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:06.661857 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" event={"ID":"26f7a865-9010-47e4-85b4-d723fe4a974d","Type":"ContainerDied","Data":"940f1cd26342eba3f9648876fc60b044d550b0784d27f9800732e49778b3342b"}
Apr 25 00:52:06.674959 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:06.674937 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"]
Apr 25 00:52:06.677613 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:52:06.677589 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod153bab5f_88b6_4b0e_bf7f_40ac12db9ced.slice/crio-0a9b71e2ccf99f795c7c3e784060260cd61dddc94d975438fc4736ce73f83515 WatchSource:0}: Error finding container 0a9b71e2ccf99f795c7c3e784060260cd61dddc94d975438fc4736ce73f83515: Status 404 returned error can't find the container with id 0a9b71e2ccf99f795c7c3e784060260cd61dddc94d975438fc4736ce73f83515
Apr 25 00:52:07.496591 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:07.496550 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" podUID="26f7a865-9010-47e4-85b4-d723fe4a974d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.60:8643/healthz\": dial tcp 10.134.0.60:8643: connect: connection refused"
Apr 25 00:52:07.666500 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:07.666457 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk" event={"ID":"153bab5f-88b6-4b0e-bf7f-40ac12db9ced","Type":"ContainerStarted","Data":"2d425c28ecc1820603174a2ab2ba6bbc2b94c5ebd024632db526d2067a0e58f0"}
Apr 25 00:52:07.666500 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:07.666493 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk" event={"ID":"153bab5f-88b6-4b0e-bf7f-40ac12db9ced","Type":"ContainerStarted","Data":"0a9b71e2ccf99f795c7c3e784060260cd61dddc94d975438fc4736ce73f83515"}
Apr 25 00:52:08.543152 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:08.543102 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" podUID="26f7a865-9010-47e4-85b4-d723fe4a974d" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.60:8080/v2/models/isvc-xgboost-v2-runtime/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Apr 25 00:52:10.678383 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:10.678327 2576 generic.go:358] "Generic (PLEG): container finished" podID="153bab5f-88b6-4b0e-bf7f-40ac12db9ced" containerID="2d425c28ecc1820603174a2ab2ba6bbc2b94c5ebd024632db526d2067a0e58f0" exitCode=0
Apr 25 00:52:10.678767 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:10.678391 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk" event={"ID":"153bab5f-88b6-4b0e-bf7f-40ac12db9ced","Type":"ContainerDied","Data":"2d425c28ecc1820603174a2ab2ba6bbc2b94c5ebd024632db526d2067a0e58f0"}
Apr 25 00:52:11.684388 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:11.684355 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk" event={"ID":"153bab5f-88b6-4b0e-bf7f-40ac12db9ced","Type":"ContainerStarted","Data":"ed854a1ecededf6bd5f0d92cc5111d2f9f65df0cda9a056c0e4c587e40cbec03"}
Apr 25 00:52:11.684858 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:11.684401 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk" event={"ID":"153bab5f-88b6-4b0e-bf7f-40ac12db9ced","Type":"ContainerStarted","Data":"6fbcd9157bf6008b3e89216d38397c0a2e07041bfd27f2012c31e315588d2e7d"}
Apr 25 00:52:11.684858 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:11.684631 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"
Apr 25 00:52:11.701529 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:11.701482 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk" podStartSLOduration=6.701466732 podStartE2EDuration="6.701466732s" podCreationTimestamp="2026-04-25 00:52:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:52:11.701079381 +0000 UTC m=+3491.982196301" watchObservedRunningTime="2026-04-25 00:52:11.701466732 +0000 UTC m=+3491.982583650"
Apr 25 00:52:12.497288 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:12.497253 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" podUID="26f7a865-9010-47e4-85b4-d723fe4a974d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.60:8643/healthz\": dial tcp 10.134.0.60:8643: connect: connection refused"
Apr 25 00:52:12.689449 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:12.689415 2576 generic.go:358] "Generic (PLEG): container finished" podID="26f7a865-9010-47e4-85b4-d723fe4a974d" containerID="e458a631df64b4fd8e53b6992b39dbd69fc6d4becabd67b79242e7f9e7326776" exitCode=0
Apr 25 00:52:12.689449 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:12.689443 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g"
Apr 25 00:52:12.690000 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:12.689476 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" event={"ID":"26f7a865-9010-47e4-85b4-d723fe4a974d","Type":"ContainerDied","Data":"e458a631df64b4fd8e53b6992b39dbd69fc6d4becabd67b79242e7f9e7326776"}
Apr 25 00:52:12.690000 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:12.689510 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g" event={"ID":"26f7a865-9010-47e4-85b4-d723fe4a974d","Type":"ContainerDied","Data":"6cf0347adc5cbc2e091030b527f336130fe642e18c462d92072107288486d466"}
Apr 25 00:52:12.690000 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:12.689532 2576 scope.go:117] "RemoveContainer" containerID="940f1cd26342eba3f9648876fc60b044d550b0784d27f9800732e49778b3342b"
Apr 25 00:52:12.690000 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:12.689817 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"
Apr 25 00:52:12.690884 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:12.690854 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk" podUID="153bab5f-88b6-4b0e-bf7f-40ac12db9ced" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused"
Apr 25 00:52:12.697435 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:12.697414 2576 scope.go:117] "RemoveContainer" containerID="e458a631df64b4fd8e53b6992b39dbd69fc6d4becabd67b79242e7f9e7326776"
Apr 25 00:52:12.706489 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:12.706398 2576 scope.go:117] "RemoveContainer" containerID="9050a997899ce6f95d46f6db8410b1a8569aa6e3ad485c06d8fb1b02fbd4d1f3"
Apr 25 00:52:12.730890 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:12.730843 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26f7a865-9010-47e4-85b4-d723fe4a974d-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"26f7a865-9010-47e4-85b4-d723fe4a974d\" (UID: \"26f7a865-9010-47e4-85b4-d723fe4a974d\") "
Apr 25 00:52:12.730890 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:12.730884 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7krjj\" (UniqueName: \"kubernetes.io/projected/26f7a865-9010-47e4-85b4-d723fe4a974d-kube-api-access-7krjj\") pod \"26f7a865-9010-47e4-85b4-d723fe4a974d\" (UID: \"26f7a865-9010-47e4-85b4-d723fe4a974d\") "
Apr 25 00:52:12.731016 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:12.730989 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26f7a865-9010-47e4-85b4-d723fe4a974d-kserve-provision-location\") pod \"26f7a865-9010-47e4-85b4-d723fe4a974d\" (UID: \"26f7a865-9010-47e4-85b4-d723fe4a974d\") "
Apr 25 00:52:12.731059 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:12.731013 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26f7a865-9010-47e4-85b4-d723fe4a974d-proxy-tls\") pod \"26f7a865-9010-47e4-85b4-d723fe4a974d\" (UID: \"26f7a865-9010-47e4-85b4-d723fe4a974d\") "
Apr 25 00:52:12.731270 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:12.731151 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26f7a865-9010-47e4-85b4-d723fe4a974d-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config") pod "26f7a865-9010-47e4-85b4-d723fe4a974d" (UID: "26f7a865-9010-47e4-85b4-d723fe4a974d"). InnerVolumeSpecName "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:52:12.731386 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:12.731268 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26f7a865-9010-47e4-85b4-d723fe4a974d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "26f7a865-9010-47e4-85b4-d723fe4a974d" (UID: "26f7a865-9010-47e4-85b4-d723fe4a974d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:52:12.732972 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:12.732948 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26f7a865-9010-47e4-85b4-d723fe4a974d-kube-api-access-7krjj" (OuterVolumeSpecName: "kube-api-access-7krjj") pod "26f7a865-9010-47e4-85b4-d723fe4a974d" (UID: "26f7a865-9010-47e4-85b4-d723fe4a974d"). InnerVolumeSpecName "kube-api-access-7krjj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:52:12.733124 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:12.733095 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26f7a865-9010-47e4-85b4-d723fe4a974d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "26f7a865-9010-47e4-85b4-d723fe4a974d" (UID: "26f7a865-9010-47e4-85b4-d723fe4a974d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:52:12.831735 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:12.831700 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26f7a865-9010-47e4-85b4-d723fe4a974d-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:52:12.831735 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:12.831728 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26f7a865-9010-47e4-85b4-d723fe4a974d-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:52:12.831735 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:12.831739 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26f7a865-9010-47e4-85b4-d723fe4a974d-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:52:12.832000 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:12.831748 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7krjj\" (UniqueName: \"kubernetes.io/projected/26f7a865-9010-47e4-85b4-d723fe4a974d-kube-api-access-7krjj\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:52:13.698007 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:13.697978 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g"
Apr 25 00:52:13.698516 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:13.698464 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk" podUID="153bab5f-88b6-4b0e-bf7f-40ac12db9ced" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused"
Apr 25 00:52:13.722202 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:13.722174 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g"]
Apr 25 00:52:13.724680 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:13.724659 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-wvg2g"]
Apr 25 00:52:14.315803 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:14.315772 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26f7a865-9010-47e4-85b4-d723fe4a974d" path="/var/lib/kubelet/pods/26f7a865-9010-47e4-85b4-d723fe4a974d/volumes"
Apr 25 00:52:18.702024 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:18.701996 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"
Apr 25 00:52:18.702630 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:18.702606 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk" podUID="153bab5f-88b6-4b0e-bf7f-40ac12db9ced" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused"
Apr 25 00:52:28.702532 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:28.702489 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"
podUID="153bab5f-88b6-4b0e-bf7f-40ac12db9ced" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 25 00:52:38.703173 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:38.703129 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk" podUID="153bab5f-88b6-4b0e-bf7f-40ac12db9ced" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 25 00:52:48.703261 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:48.703214 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk" podUID="153bab5f-88b6-4b0e-bf7f-40ac12db9ced" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 25 00:52:58.703479 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:52:58.703431 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk" podUID="153bab5f-88b6-4b0e-bf7f-40ac12db9ced" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 25 00:53:02.258348 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:02.258312 2576 scope.go:117] "RemoveContainer" containerID="bec0ec98c814c5683d8af48adfdf6a7d303fe8c36a32cbe1271de24f8f1f0346" Apr 25 00:53:02.266045 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:02.266025 2576 scope.go:117] "RemoveContainer" containerID="ad8c4fdcaa8b4a090a9693c81582654da1f9aef82fa4f5e030a490693fe1d8ac" Apr 25 00:53:08.703079 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:08.703047 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk" Apr 25 00:53:15.731742 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.731708 2576 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"] Apr 25 00:53:15.732175 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.732010 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk" podUID="153bab5f-88b6-4b0e-bf7f-40ac12db9ced" containerName="kserve-container" containerID="cri-o://6fbcd9157bf6008b3e89216d38397c0a2e07041bfd27f2012c31e315588d2e7d" gracePeriod=30 Apr 25 00:53:15.732175 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.732073 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk" podUID="153bab5f-88b6-4b0e-bf7f-40ac12db9ced" containerName="kube-rbac-proxy" containerID="cri-o://ed854a1ecededf6bd5f0d92cc5111d2f9f65df0cda9a056c0e4c587e40cbec03" gracePeriod=30 Apr 25 00:53:15.810221 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.810188 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p"] Apr 25 00:53:15.810462 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.810450 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26f7a865-9010-47e4-85b4-d723fe4a974d" containerName="storage-initializer" Apr 25 00:53:15.810511 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.810464 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="26f7a865-9010-47e4-85b4-d723fe4a974d" containerName="storage-initializer" Apr 25 00:53:15.810511 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.810476 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26f7a865-9010-47e4-85b4-d723fe4a974d" containerName="kserve-container" Apr 25 00:53:15.810511 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.810481 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="26f7a865-9010-47e4-85b4-d723fe4a974d" containerName="kserve-container" Apr 25 00:53:15.810511 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.810489 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26f7a865-9010-47e4-85b4-d723fe4a974d" containerName="kube-rbac-proxy" Apr 25 00:53:15.810511 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.810495 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="26f7a865-9010-47e4-85b4-d723fe4a974d" containerName="kube-rbac-proxy" Apr 25 00:53:15.810686 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.810539 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="26f7a865-9010-47e4-85b4-d723fe4a974d" containerName="kube-rbac-proxy" Apr 25 00:53:15.810686 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.810548 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="26f7a865-9010-47e4-85b4-d723fe4a974d" containerName="kserve-container" Apr 25 00:53:15.815116 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.815095 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" Apr 25 00:53:15.817212 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.817178 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-predictor-serving-cert\"" Apr 25 00:53:15.817336 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.817224 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 25 00:53:15.817469 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.817456 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-kube-rbac-proxy-sar-config\"" Apr 25 00:53:15.823959 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.823906 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p"] Apr 25 00:53:15.879737 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.879713 2576 generic.go:358] "Generic (PLEG): container finished" podID="153bab5f-88b6-4b0e-bf7f-40ac12db9ced" containerID="ed854a1ecededf6bd5f0d92cc5111d2f9f65df0cda9a056c0e4c587e40cbec03" exitCode=2 Apr 25 00:53:15.879844 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.879786 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk" event={"ID":"153bab5f-88b6-4b0e-bf7f-40ac12db9ced","Type":"ContainerDied","Data":"ed854a1ecededf6bd5f0d92cc5111d2f9f65df0cda9a056c0e4c587e40cbec03"} Apr 25 00:53:15.901593 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.898282 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33472dc9-805c-42f8-ae87-105a010b793a-proxy-tls\") pod \"isvc-sklearn-s3-predictor-68847956b-cl76p\" (UID: \"33472dc9-805c-42f8-ae87-105a010b793a\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" Apr 25 00:53:15.901593 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.898359 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/33472dc9-805c-42f8-ae87-105a010b793a-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-68847956b-cl76p\" (UID: \"33472dc9-805c-42f8-ae87-105a010b793a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" Apr 25 00:53:15.901593 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.898449 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33472dc9-805c-42f8-ae87-105a010b793a-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-68847956b-cl76p\" (UID: \"33472dc9-805c-42f8-ae87-105a010b793a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" Apr 25 00:53:15.901593 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.898515 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csp2n\" (UniqueName: \"kubernetes.io/projected/33472dc9-805c-42f8-ae87-105a010b793a-kube-api-access-csp2n\") pod \"isvc-sklearn-s3-predictor-68847956b-cl76p\" (UID: \"33472dc9-805c-42f8-ae87-105a010b793a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" Apr 25 00:53:15.999744 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.999680 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33472dc9-805c-42f8-ae87-105a010b793a-proxy-tls\") pod \"isvc-sklearn-s3-predictor-68847956b-cl76p\" (UID: \"33472dc9-805c-42f8-ae87-105a010b793a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" Apr 25 00:53:15.999744 
ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.999713 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/33472dc9-805c-42f8-ae87-105a010b793a-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-68847956b-cl76p\" (UID: \"33472dc9-805c-42f8-ae87-105a010b793a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" Apr 25 00:53:15.999744 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.999743 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33472dc9-805c-42f8-ae87-105a010b793a-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-68847956b-cl76p\" (UID: \"33472dc9-805c-42f8-ae87-105a010b793a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" Apr 25 00:53:15.999993 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:15.999769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csp2n\" (UniqueName: \"kubernetes.io/projected/33472dc9-805c-42f8-ae87-105a010b793a-kube-api-access-csp2n\") pod \"isvc-sklearn-s3-predictor-68847956b-cl76p\" (UID: \"33472dc9-805c-42f8-ae87-105a010b793a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" Apr 25 00:53:15.999993 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:53:15.999887 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-predictor-serving-cert: secret "isvc-sklearn-s3-predictor-serving-cert" not found Apr 25 00:53:16.000109 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:53:16.000010 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33472dc9-805c-42f8-ae87-105a010b793a-proxy-tls podName:33472dc9-805c-42f8-ae87-105a010b793a nodeName:}" failed. 
No retries permitted until 2026-04-25 00:53:16.499988383 +0000 UTC m=+3556.781105297 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/33472dc9-805c-42f8-ae87-105a010b793a-proxy-tls") pod "isvc-sklearn-s3-predictor-68847956b-cl76p" (UID: "33472dc9-805c-42f8-ae87-105a010b793a") : secret "isvc-sklearn-s3-predictor-serving-cert" not found Apr 25 00:53:16.000210 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:16.000191 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33472dc9-805c-42f8-ae87-105a010b793a-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-68847956b-cl76p\" (UID: \"33472dc9-805c-42f8-ae87-105a010b793a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" Apr 25 00:53:16.000477 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:16.000455 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/33472dc9-805c-42f8-ae87-105a010b793a-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-68847956b-cl76p\" (UID: \"33472dc9-805c-42f8-ae87-105a010b793a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" Apr 25 00:53:16.008326 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:16.008305 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-csp2n\" (UniqueName: \"kubernetes.io/projected/33472dc9-805c-42f8-ae87-105a010b793a-kube-api-access-csp2n\") pod \"isvc-sklearn-s3-predictor-68847956b-cl76p\" (UID: \"33472dc9-805c-42f8-ae87-105a010b793a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" Apr 25 00:53:16.502377 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:16.502328 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/33472dc9-805c-42f8-ae87-105a010b793a-proxy-tls\") pod \"isvc-sklearn-s3-predictor-68847956b-cl76p\" (UID: \"33472dc9-805c-42f8-ae87-105a010b793a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" Apr 25 00:53:16.504817 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:16.504798 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33472dc9-805c-42f8-ae87-105a010b793a-proxy-tls\") pod \"isvc-sklearn-s3-predictor-68847956b-cl76p\" (UID: \"33472dc9-805c-42f8-ae87-105a010b793a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" Apr 25 00:53:16.726354 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:16.726322 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" Apr 25 00:53:16.843605 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:16.843581 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p"] Apr 25 00:53:16.845949 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:53:16.845902 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33472dc9_805c_42f8_ae87_105a010b793a.slice/crio-d9546f15266cf00e424fd07f416c546ffc31743633a263253807c9374e3f9fc2 WatchSource:0}: Error finding container d9546f15266cf00e424fd07f416c546ffc31743633a263253807c9374e3f9fc2: Status 404 returned error can't find the container with id d9546f15266cf00e424fd07f416c546ffc31743633a263253807c9374e3f9fc2 Apr 25 00:53:16.847809 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:16.847790 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:53:16.884406 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:16.884382 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" event={"ID":"33472dc9-805c-42f8-ae87-105a010b793a","Type":"ContainerStarted","Data":"d9546f15266cf00e424fd07f416c546ffc31743633a263253807c9374e3f9fc2"} Apr 25 00:53:17.889610 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:17.889568 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" event={"ID":"33472dc9-805c-42f8-ae87-105a010b793a","Type":"ContainerStarted","Data":"dc7a68e5e570bb0d18ca8d58eb759e53cf43cfe1881f07dbc77bb638ef63b208"} Apr 25 00:53:18.698420 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:18.698378 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk" podUID="153bab5f-88b6-4b0e-bf7f-40ac12db9ced" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.61:8643/healthz\": dial tcp 10.134.0.61:8643: connect: connection refused" Apr 25 00:53:18.702669 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:18.702644 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk" podUID="153bab5f-88b6-4b0e-bf7f-40ac12db9ced" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 25 00:53:18.893952 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:18.893904 2576 generic.go:358] "Generic (PLEG): container finished" podID="33472dc9-805c-42f8-ae87-105a010b793a" containerID="dc7a68e5e570bb0d18ca8d58eb759e53cf43cfe1881f07dbc77bb638ef63b208" exitCode=0 Apr 25 00:53:18.894334 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:18.893957 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" event={"ID":"33472dc9-805c-42f8-ae87-105a010b793a","Type":"ContainerDied","Data":"dc7a68e5e570bb0d18ca8d58eb759e53cf43cfe1881f07dbc77bb638ef63b208"} Apr 25 
00:53:19.275198 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.275177 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk" Apr 25 00:53:19.321542 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.321511 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7svp8\" (UniqueName: \"kubernetes.io/projected/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-kube-api-access-7svp8\") pod \"153bab5f-88b6-4b0e-bf7f-40ac12db9ced\" (UID: \"153bab5f-88b6-4b0e-bf7f-40ac12db9ced\") " Apr 25 00:53:19.321690 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.321559 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"153bab5f-88b6-4b0e-bf7f-40ac12db9ced\" (UID: \"153bab5f-88b6-4b0e-bf7f-40ac12db9ced\") " Apr 25 00:53:19.321690 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.321582 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-kserve-provision-location\") pod \"153bab5f-88b6-4b0e-bf7f-40ac12db9ced\" (UID: \"153bab5f-88b6-4b0e-bf7f-40ac12db9ced\") " Apr 25 00:53:19.321690 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.321602 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-proxy-tls\") pod \"153bab5f-88b6-4b0e-bf7f-40ac12db9ced\" (UID: \"153bab5f-88b6-4b0e-bf7f-40ac12db9ced\") " Apr 25 00:53:19.321980 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.321956 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-isvc-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-kube-rbac-proxy-sar-config") pod "153bab5f-88b6-4b0e-bf7f-40ac12db9ced" (UID: "153bab5f-88b6-4b0e-bf7f-40ac12db9ced"). InnerVolumeSpecName "isvc-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:53:19.322053 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.321976 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "153bab5f-88b6-4b0e-bf7f-40ac12db9ced" (UID: "153bab5f-88b6-4b0e-bf7f-40ac12db9ced"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:53:19.323764 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.323746 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "153bab5f-88b6-4b0e-bf7f-40ac12db9ced" (UID: "153bab5f-88b6-4b0e-bf7f-40ac12db9ced"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:53:19.323832 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.323758 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-kube-api-access-7svp8" (OuterVolumeSpecName: "kube-api-access-7svp8") pod "153bab5f-88b6-4b0e-bf7f-40ac12db9ced" (UID: "153bab5f-88b6-4b0e-bf7f-40ac12db9ced"). InnerVolumeSpecName "kube-api-access-7svp8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:53:19.422958 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.422902 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:53:19.422958 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.422953 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:53:19.422958 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.422964 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7svp8\" (UniqueName: \"kubernetes.io/projected/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-kube-api-access-7svp8\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:53:19.422958 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.422975 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/153bab5f-88b6-4b0e-bf7f-40ac12db9ced-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:53:19.898936 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.898873 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" event={"ID":"33472dc9-805c-42f8-ae87-105a010b793a","Type":"ContainerStarted","Data":"123e512fe9319e379e8d255d072dfd81cdebdb00bdd3f77d43e33fc581c3f933"} Apr 25 00:53:19.898936 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.898938 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" 
event={"ID":"33472dc9-805c-42f8-ae87-105a010b793a","Type":"ContainerStarted","Data":"c68709b4ed5bb12476855a66d8cf0f0f18458d22ce64d4b597c4bb5c6bb23c70"} Apr 25 00:53:19.899387 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.899020 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" Apr 25 00:53:19.900485 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.900459 2576 generic.go:358] "Generic (PLEG): container finished" podID="153bab5f-88b6-4b0e-bf7f-40ac12db9ced" containerID="6fbcd9157bf6008b3e89216d38397c0a2e07041bfd27f2012c31e315588d2e7d" exitCode=0 Apr 25 00:53:19.900619 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.900509 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk" event={"ID":"153bab5f-88b6-4b0e-bf7f-40ac12db9ced","Type":"ContainerDied","Data":"6fbcd9157bf6008b3e89216d38397c0a2e07041bfd27f2012c31e315588d2e7d"} Apr 25 00:53:19.900619 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.900524 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk" Apr 25 00:53:19.900619 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.900537 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk" event={"ID":"153bab5f-88b6-4b0e-bf7f-40ac12db9ced","Type":"ContainerDied","Data":"0a9b71e2ccf99f795c7c3e784060260cd61dddc94d975438fc4736ce73f83515"} Apr 25 00:53:19.900619 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.900558 2576 scope.go:117] "RemoveContainer" containerID="ed854a1ecededf6bd5f0d92cc5111d2f9f65df0cda9a056c0e4c587e40cbec03" Apr 25 00:53:19.908662 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.908639 2576 scope.go:117] "RemoveContainer" containerID="6fbcd9157bf6008b3e89216d38397c0a2e07041bfd27f2012c31e315588d2e7d" Apr 25 00:53:19.915313 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.915296 2576 scope.go:117] "RemoveContainer" containerID="2d425c28ecc1820603174a2ab2ba6bbc2b94c5ebd024632db526d2067a0e58f0" Apr 25 00:53:19.918387 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.918350 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" podStartSLOduration=4.918339207 podStartE2EDuration="4.918339207s" podCreationTimestamp="2026-04-25 00:53:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:53:19.9172608 +0000 UTC m=+3560.198377721" watchObservedRunningTime="2026-04-25 00:53:19.918339207 +0000 UTC m=+3560.199456169" Apr 25 00:53:19.922235 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.922218 2576 scope.go:117] "RemoveContainer" containerID="ed854a1ecededf6bd5f0d92cc5111d2f9f65df0cda9a056c0e4c587e40cbec03" Apr 25 00:53:19.922457 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:53:19.922438 2576 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"ed854a1ecededf6bd5f0d92cc5111d2f9f65df0cda9a056c0e4c587e40cbec03\": container with ID starting with ed854a1ecededf6bd5f0d92cc5111d2f9f65df0cda9a056c0e4c587e40cbec03 not found: ID does not exist" containerID="ed854a1ecededf6bd5f0d92cc5111d2f9f65df0cda9a056c0e4c587e40cbec03" Apr 25 00:53:19.922525 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.922470 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed854a1ecededf6bd5f0d92cc5111d2f9f65df0cda9a056c0e4c587e40cbec03"} err="failed to get container status \"ed854a1ecededf6bd5f0d92cc5111d2f9f65df0cda9a056c0e4c587e40cbec03\": rpc error: code = NotFound desc = could not find container \"ed854a1ecededf6bd5f0d92cc5111d2f9f65df0cda9a056c0e4c587e40cbec03\": container with ID starting with ed854a1ecededf6bd5f0d92cc5111d2f9f65df0cda9a056c0e4c587e40cbec03 not found: ID does not exist" Apr 25 00:53:19.922525 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.922493 2576 scope.go:117] "RemoveContainer" containerID="6fbcd9157bf6008b3e89216d38397c0a2e07041bfd27f2012c31e315588d2e7d" Apr 25 00:53:19.922713 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:53:19.922697 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fbcd9157bf6008b3e89216d38397c0a2e07041bfd27f2012c31e315588d2e7d\": container with ID starting with 6fbcd9157bf6008b3e89216d38397c0a2e07041bfd27f2012c31e315588d2e7d not found: ID does not exist" containerID="6fbcd9157bf6008b3e89216d38397c0a2e07041bfd27f2012c31e315588d2e7d" Apr 25 00:53:19.922753 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.922721 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fbcd9157bf6008b3e89216d38397c0a2e07041bfd27f2012c31e315588d2e7d"} err="failed to get container status \"6fbcd9157bf6008b3e89216d38397c0a2e07041bfd27f2012c31e315588d2e7d\": rpc 
error: code = NotFound desc = could not find container \"6fbcd9157bf6008b3e89216d38397c0a2e07041bfd27f2012c31e315588d2e7d\": container with ID starting with 6fbcd9157bf6008b3e89216d38397c0a2e07041bfd27f2012c31e315588d2e7d not found: ID does not exist" Apr 25 00:53:19.922753 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.922739 2576 scope.go:117] "RemoveContainer" containerID="2d425c28ecc1820603174a2ab2ba6bbc2b94c5ebd024632db526d2067a0e58f0" Apr 25 00:53:19.922947 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:53:19.922928 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d425c28ecc1820603174a2ab2ba6bbc2b94c5ebd024632db526d2067a0e58f0\": container with ID starting with 2d425c28ecc1820603174a2ab2ba6bbc2b94c5ebd024632db526d2067a0e58f0 not found: ID does not exist" containerID="2d425c28ecc1820603174a2ab2ba6bbc2b94c5ebd024632db526d2067a0e58f0" Apr 25 00:53:19.923038 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.922948 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d425c28ecc1820603174a2ab2ba6bbc2b94c5ebd024632db526d2067a0e58f0"} err="failed to get container status \"2d425c28ecc1820603174a2ab2ba6bbc2b94c5ebd024632db526d2067a0e58f0\": rpc error: code = NotFound desc = could not find container \"2d425c28ecc1820603174a2ab2ba6bbc2b94c5ebd024632db526d2067a0e58f0\": container with ID starting with 2d425c28ecc1820603174a2ab2ba6bbc2b94c5ebd024632db526d2067a0e58f0 not found: ID does not exist" Apr 25 00:53:19.929546 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.929527 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"] Apr 25 00:53:19.933372 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:19.933352 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-ws6jk"] Apr 25 00:53:20.315227 ip-10-0-129-109 
kubenswrapper[2576]: I0425 00:53:20.315196 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="153bab5f-88b6-4b0e-bf7f-40ac12db9ced" path="/var/lib/kubelet/pods/153bab5f-88b6-4b0e-bf7f-40ac12db9ced/volumes" Apr 25 00:53:20.903866 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:20.903827 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" Apr 25 00:53:20.905143 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:20.905116 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" podUID="33472dc9-805c-42f8-ae87-105a010b793a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 25 00:53:21.906910 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:21.906869 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" podUID="33472dc9-805c-42f8-ae87-105a010b793a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 25 00:53:26.911592 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:26.911563 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" Apr 25 00:53:26.912220 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:26.912192 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" podUID="33472dc9-805c-42f8-ae87-105a010b793a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 25 00:53:36.912512 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:36.912468 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" 
podUID="33472dc9-805c-42f8-ae87-105a010b793a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 25 00:53:46.912708 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:46.912668 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" podUID="33472dc9-805c-42f8-ae87-105a010b793a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 25 00:53:56.912497 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:53:56.912450 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" podUID="33472dc9-805c-42f8-ae87-105a010b793a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 25 00:54:06.912520 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:06.912477 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" podUID="33472dc9-805c-42f8-ae87-105a010b793a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 25 00:54:16.913017 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:16.912977 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" podUID="33472dc9-805c-42f8-ae87-105a010b793a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 25 00:54:26.913091 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:26.913062 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" Apr 25 00:54:35.898211 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:35.898180 2576 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p"] Apr 25 00:54:35.898587 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:35.898493 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" podUID="33472dc9-805c-42f8-ae87-105a010b793a" containerName="kserve-container" containerID="cri-o://c68709b4ed5bb12476855a66d8cf0f0f18458d22ce64d4b597c4bb5c6bb23c70" gracePeriod=30 Apr 25 00:54:35.898587 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:35.898534 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" podUID="33472dc9-805c-42f8-ae87-105a010b793a" containerName="kube-rbac-proxy" containerID="cri-o://123e512fe9319e379e8d255d072dfd81cdebdb00bdd3f77d43e33fc581c3f933" gracePeriod=30 Apr 25 00:54:36.009385 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.009353 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl"] Apr 25 00:54:36.009651 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.009636 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="153bab5f-88b6-4b0e-bf7f-40ac12db9ced" containerName="storage-initializer" Apr 25 00:54:36.009732 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.009654 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="153bab5f-88b6-4b0e-bf7f-40ac12db9ced" containerName="storage-initializer" Apr 25 00:54:36.009732 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.009664 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="153bab5f-88b6-4b0e-bf7f-40ac12db9ced" containerName="kube-rbac-proxy" Apr 25 00:54:36.009732 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.009670 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="153bab5f-88b6-4b0e-bf7f-40ac12db9ced" 
containerName="kube-rbac-proxy" Apr 25 00:54:36.009732 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.009686 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="153bab5f-88b6-4b0e-bf7f-40ac12db9ced" containerName="kserve-container" Apr 25 00:54:36.009732 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.009692 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="153bab5f-88b6-4b0e-bf7f-40ac12db9ced" containerName="kserve-container" Apr 25 00:54:36.009995 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.009744 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="153bab5f-88b6-4b0e-bf7f-40ac12db9ced" containerName="kube-rbac-proxy" Apr 25 00:54:36.009995 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.009752 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="153bab5f-88b6-4b0e-bf7f-40ac12db9ced" containerName="kserve-container" Apr 25 00:54:36.012429 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.012412 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:54:36.014651 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.014626 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-predictor-serving-cert\"" Apr 25 00:54:36.014758 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.014678 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\"" Apr 25 00:54:36.014851 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.014837 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 25 00:54:36.021468 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.021443 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl"] Apr 25 00:54:36.093878 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.093851 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl\" (UID: \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:54:36.094034 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.093883 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl\" (UID: \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:54:36.094034 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.093902 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr5tx\" (UniqueName: \"kubernetes.io/projected/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-kube-api-access-qr5tx\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl\" (UID: \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:54:36.094034 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.093942 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl\" (UID: \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:54:36.094034 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.094025 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl\" (UID: \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:54:36.113768 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.113742 2576 generic.go:358] "Generic (PLEG): container finished" podID="33472dc9-805c-42f8-ae87-105a010b793a" containerID="123e512fe9319e379e8d255d072dfd81cdebdb00bdd3f77d43e33fc581c3f933" exitCode=2 Apr 25 00:54:36.113881 ip-10-0-129-109 kubenswrapper[2576]: 
I0425 00:54:36.113821 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" event={"ID":"33472dc9-805c-42f8-ae87-105a010b793a","Type":"ContainerDied","Data":"123e512fe9319e379e8d255d072dfd81cdebdb00bdd3f77d43e33fc581c3f933"} Apr 25 00:54:36.194370 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.194303 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl\" (UID: \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:54:36.194370 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.194334 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl\" (UID: \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:54:36.194370 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.194358 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qr5tx\" (UniqueName: \"kubernetes.io/projected/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-kube-api-access-qr5tx\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl\" (UID: \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:54:36.194568 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.194377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl\" (UID: \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:54:36.194568 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.194416 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl\" (UID: \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:54:36.194568 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:54:36.194510 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-serving-cert: secret "isvc-sklearn-s3-tls-global-pass-predictor-serving-cert" not found Apr 25 00:54:36.194568 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:54:36.194564 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-proxy-tls podName:7a8ed903-d44c-4e32-aff3-f7fc0ff0512e nodeName:}" failed. No retries permitted until 2026-04-25 00:54:36.694542261 +0000 UTC m=+3636.975659162 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-proxy-tls") pod "isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" (UID: "7a8ed903-d44c-4e32-aff3-f7fc0ff0512e") : secret "isvc-sklearn-s3-tls-global-pass-predictor-serving-cert" not found Apr 25 00:54:36.194772 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.194739 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl\" (UID: \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:54:36.195040 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.195023 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl\" (UID: \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:54:36.195083 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.195065 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl\" (UID: \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:54:36.204777 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.204758 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qr5tx\" (UniqueName: \"kubernetes.io/projected/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-kube-api-access-qr5tx\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl\" (UID: \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:54:36.698801 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.698767 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl\" (UID: \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:54:36.701276 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.701247 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl\" (UID: \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:54:36.907901 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.907859 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" podUID="33472dc9-805c-42f8-ae87-105a010b793a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.62:8643/healthz\": dial tcp 10.134.0.62:8643: connect: connection refused" Apr 25 00:54:36.912184 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.912158 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" podUID="33472dc9-805c-42f8-ae87-105a010b793a" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.62:8080: connect: connection refused" Apr 25 00:54:36.924454 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:36.924433 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:54:37.044467 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:37.044433 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl"] Apr 25 00:54:37.047371 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:54:37.047344 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a8ed903_d44c_4e32_aff3_f7fc0ff0512e.slice/crio-2a1cf38b0a01b1b3a52f41e1d7e6168612893261e46b7f8ffb3fc5ec5cf61f32 WatchSource:0}: Error finding container 2a1cf38b0a01b1b3a52f41e1d7e6168612893261e46b7f8ffb3fc5ec5cf61f32: Status 404 returned error can't find the container with id 2a1cf38b0a01b1b3a52f41e1d7e6168612893261e46b7f8ffb3fc5ec5cf61f32 Apr 25 00:54:37.117743 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:37.117710 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" event={"ID":"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e","Type":"ContainerStarted","Data":"25331556e5d23ca5468395a979bb2b0ae5108e66c2245a19efe11addd81afd86"} Apr 25 00:54:37.117830 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:37.117747 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" event={"ID":"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e","Type":"ContainerStarted","Data":"2a1cf38b0a01b1b3a52f41e1d7e6168612893261e46b7f8ffb3fc5ec5cf61f32"} Apr 25 00:54:37.194935 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:37.194886 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 25 00:54:37.201540 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:37.201522 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 25 00:54:38.121609 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:38.121581 2576 generic.go:358] "Generic (PLEG): container finished" podID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" containerID="25331556e5d23ca5468395a979bb2b0ae5108e66c2245a19efe11addd81afd86" exitCode=0 Apr 25 00:54:38.121935 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:38.121662 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" event={"ID":"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e","Type":"ContainerDied","Data":"25331556e5d23ca5468395a979bb2b0ae5108e66c2245a19efe11addd81afd86"} Apr 25 00:54:39.126753 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:39.126703 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" event={"ID":"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e","Type":"ContainerStarted","Data":"646e3efd80fe70984b265c9bf72aa781981ebd3894233e4c44e806303e4bdf85"} Apr 25 00:54:39.127187 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:39.126763 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" event={"ID":"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e","Type":"ContainerStarted","Data":"3a24e146a678d5d5aeae8a5ac530528db80785d97f9cacc0dcec59f1fa89402d"} Apr 25 00:54:39.127187 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:39.127057 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:54:39.152581 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:39.152541 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" podStartSLOduration=4.152528531 podStartE2EDuration="4.152528531s" podCreationTimestamp="2026-04-25 00:54:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:54:39.150092286 +0000 UTC m=+3639.431209205" watchObservedRunningTime="2026-04-25 00:54:39.152528531 +0000 UTC m=+3639.433645451" Apr 25 00:54:40.030027 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.030006 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" Apr 25 00:54:40.125261 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.125233 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33472dc9-805c-42f8-ae87-105a010b793a-kserve-provision-location\") pod \"33472dc9-805c-42f8-ae87-105a010b793a\" (UID: \"33472dc9-805c-42f8-ae87-105a010b793a\") " Apr 25 00:54:40.125443 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.125290 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csp2n\" (UniqueName: \"kubernetes.io/projected/33472dc9-805c-42f8-ae87-105a010b793a-kube-api-access-csp2n\") pod \"33472dc9-805c-42f8-ae87-105a010b793a\" (UID: \"33472dc9-805c-42f8-ae87-105a010b793a\") " Apr 25 00:54:40.125443 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.125329 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/33472dc9-805c-42f8-ae87-105a010b793a-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"33472dc9-805c-42f8-ae87-105a010b793a\" (UID: \"33472dc9-805c-42f8-ae87-105a010b793a\") " Apr 25 00:54:40.125443 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.125362 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33472dc9-805c-42f8-ae87-105a010b793a-proxy-tls\") pod \"33472dc9-805c-42f8-ae87-105a010b793a\" (UID: \"33472dc9-805c-42f8-ae87-105a010b793a\") " Apr 25 00:54:40.125638 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.125545 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33472dc9-805c-42f8-ae87-105a010b793a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "33472dc9-805c-42f8-ae87-105a010b793a" (UID: "33472dc9-805c-42f8-ae87-105a010b793a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:54:40.125735 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.125712 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33472dc9-805c-42f8-ae87-105a010b793a-isvc-sklearn-s3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-kube-rbac-proxy-sar-config") pod "33472dc9-805c-42f8-ae87-105a010b793a" (UID: "33472dc9-805c-42f8-ae87-105a010b793a"). InnerVolumeSpecName "isvc-sklearn-s3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:54:40.127565 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.127537 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33472dc9-805c-42f8-ae87-105a010b793a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "33472dc9-805c-42f8-ae87-105a010b793a" (UID: "33472dc9-805c-42f8-ae87-105a010b793a"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:54:40.127950 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.127573 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33472dc9-805c-42f8-ae87-105a010b793a-kube-api-access-csp2n" (OuterVolumeSpecName: "kube-api-access-csp2n") pod "33472dc9-805c-42f8-ae87-105a010b793a" (UID: "33472dc9-805c-42f8-ae87-105a010b793a"). InnerVolumeSpecName "kube-api-access-csp2n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:54:40.131173 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.131148 2576 generic.go:358] "Generic (PLEG): container finished" podID="33472dc9-805c-42f8-ae87-105a010b793a" containerID="c68709b4ed5bb12476855a66d8cf0f0f18458d22ce64d4b597c4bb5c6bb23c70" exitCode=0 Apr 25 00:54:40.131292 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.131231 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" event={"ID":"33472dc9-805c-42f8-ae87-105a010b793a","Type":"ContainerDied","Data":"c68709b4ed5bb12476855a66d8cf0f0f18458d22ce64d4b597c4bb5c6bb23c70"} Apr 25 00:54:40.131292 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.131282 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" event={"ID":"33472dc9-805c-42f8-ae87-105a010b793a","Type":"ContainerDied","Data":"d9546f15266cf00e424fd07f416c546ffc31743633a263253807c9374e3f9fc2"} Apr 25 00:54:40.131417 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.131303 2576 scope.go:117] "RemoveContainer" containerID="123e512fe9319e379e8d255d072dfd81cdebdb00bdd3f77d43e33fc581c3f933" Apr 25 00:54:40.131417 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.131245 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p" Apr 25 00:54:40.131523 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.131502 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:54:40.132898 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.132870 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" podUID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 25 00:54:40.139116 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.139103 2576 scope.go:117] "RemoveContainer" containerID="c68709b4ed5bb12476855a66d8cf0f0f18458d22ce64d4b597c4bb5c6bb23c70" Apr 25 00:54:40.146192 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.146176 2576 scope.go:117] "RemoveContainer" containerID="dc7a68e5e570bb0d18ca8d58eb759e53cf43cfe1881f07dbc77bb638ef63b208" Apr 25 00:54:40.152697 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.152681 2576 scope.go:117] "RemoveContainer" containerID="123e512fe9319e379e8d255d072dfd81cdebdb00bdd3f77d43e33fc581c3f933" Apr 25 00:54:40.152952 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:54:40.152933 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"123e512fe9319e379e8d255d072dfd81cdebdb00bdd3f77d43e33fc581c3f933\": container with ID starting with 123e512fe9319e379e8d255d072dfd81cdebdb00bdd3f77d43e33fc581c3f933 not found: ID does not exist" containerID="123e512fe9319e379e8d255d072dfd81cdebdb00bdd3f77d43e33fc581c3f933" Apr 25 00:54:40.153002 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.152959 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"123e512fe9319e379e8d255d072dfd81cdebdb00bdd3f77d43e33fc581c3f933"} err="failed to get container status \"123e512fe9319e379e8d255d072dfd81cdebdb00bdd3f77d43e33fc581c3f933\": rpc error: code = NotFound desc = could not find container \"123e512fe9319e379e8d255d072dfd81cdebdb00bdd3f77d43e33fc581c3f933\": container with ID starting with 123e512fe9319e379e8d255d072dfd81cdebdb00bdd3f77d43e33fc581c3f933 not found: ID does not exist" Apr 25 00:54:40.153002 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.152975 2576 scope.go:117] "RemoveContainer" containerID="c68709b4ed5bb12476855a66d8cf0f0f18458d22ce64d4b597c4bb5c6bb23c70" Apr 25 00:54:40.153198 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:54:40.153182 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c68709b4ed5bb12476855a66d8cf0f0f18458d22ce64d4b597c4bb5c6bb23c70\": container with ID starting with c68709b4ed5bb12476855a66d8cf0f0f18458d22ce64d4b597c4bb5c6bb23c70 not found: ID does not exist" containerID="c68709b4ed5bb12476855a66d8cf0f0f18458d22ce64d4b597c4bb5c6bb23c70" Apr 25 00:54:40.153287 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.153201 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c68709b4ed5bb12476855a66d8cf0f0f18458d22ce64d4b597c4bb5c6bb23c70"} err="failed to get container status \"c68709b4ed5bb12476855a66d8cf0f0f18458d22ce64d4b597c4bb5c6bb23c70\": rpc error: code = NotFound desc = could not find container \"c68709b4ed5bb12476855a66d8cf0f0f18458d22ce64d4b597c4bb5c6bb23c70\": container with ID starting with c68709b4ed5bb12476855a66d8cf0f0f18458d22ce64d4b597c4bb5c6bb23c70 not found: ID does not exist" Apr 25 00:54:40.153287 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.153213 2576 scope.go:117] "RemoveContainer" containerID="dc7a68e5e570bb0d18ca8d58eb759e53cf43cfe1881f07dbc77bb638ef63b208" Apr 25 00:54:40.153416 ip-10-0-129-109 
kubenswrapper[2576]: E0425 00:54:40.153401 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc7a68e5e570bb0d18ca8d58eb759e53cf43cfe1881f07dbc77bb638ef63b208\": container with ID starting with dc7a68e5e570bb0d18ca8d58eb759e53cf43cfe1881f07dbc77bb638ef63b208 not found: ID does not exist" containerID="dc7a68e5e570bb0d18ca8d58eb759e53cf43cfe1881f07dbc77bb638ef63b208" Apr 25 00:54:40.153454 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.153420 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc7a68e5e570bb0d18ca8d58eb759e53cf43cfe1881f07dbc77bb638ef63b208"} err="failed to get container status \"dc7a68e5e570bb0d18ca8d58eb759e53cf43cfe1881f07dbc77bb638ef63b208\": rpc error: code = NotFound desc = could not find container \"dc7a68e5e570bb0d18ca8d58eb759e53cf43cfe1881f07dbc77bb638ef63b208\": container with ID starting with dc7a68e5e570bb0d18ca8d58eb759e53cf43cfe1881f07dbc77bb638ef63b208 not found: ID does not exist" Apr 25 00:54:40.162112 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.162087 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p"] Apr 25 00:54:40.167422 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.167398 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-68847956b-cl76p"] Apr 25 00:54:40.226909 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.226869 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33472dc9-805c-42f8-ae87-105a010b793a-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:54:40.226909 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.226909 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-csp2n\" (UniqueName: 
\"kubernetes.io/projected/33472dc9-805c-42f8-ae87-105a010b793a-kube-api-access-csp2n\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:54:40.227086 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.226948 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/33472dc9-805c-42f8-ae87-105a010b793a-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:54:40.227086 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.226962 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33472dc9-805c-42f8-ae87-105a010b793a-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:54:40.314892 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:40.314865 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33472dc9-805c-42f8-ae87-105a010b793a" path="/var/lib/kubelet/pods/33472dc9-805c-42f8-ae87-105a010b793a/volumes" Apr 25 00:54:41.135576 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:41.135538 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" podUID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 25 00:54:46.140362 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:46.140336 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:54:46.140883 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:46.140860 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" podUID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 25 00:54:56.141419 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:54:56.141377 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" podUID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 25 00:55:02.290980 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:02.290949 2576 scope.go:117] "RemoveContainer" containerID="8429b79f2b11810d1f29d8995987cb0892823179111de4ce97ccc8c195273c76" Apr 25 00:55:02.298431 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:02.298412 2576 scope.go:117] "RemoveContainer" containerID="1a321231077007fa3b0cd768db5f175c964f2f76eeb1481bd23014cb6a4cc20b" Apr 25 00:55:02.305349 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:02.305329 2576 scope.go:117] "RemoveContainer" containerID="475a7b566f34c3fc29ceee907725f2b95c14a55a60e200795ecb144db6c48137" Apr 25 00:55:06.141600 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:06.141562 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" podUID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 25 00:55:16.141378 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:16.141334 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" podUID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 25 00:55:26.141522 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:26.141479 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" podUID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 25 00:55:36.141637 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:36.141596 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" podUID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 25 00:55:46.142094 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:46.142056 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:55:56.046033 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:56.045986 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl"] Apr 25 00:55:56.046461 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:56.046309 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" podUID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" containerName="kserve-container" containerID="cri-o://3a24e146a678d5d5aeae8a5ac530528db80785d97f9cacc0dcec59f1fa89402d" gracePeriod=30 Apr 25 00:55:56.046461 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:56.046329 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" podUID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" containerName="kube-rbac-proxy" containerID="cri-o://646e3efd80fe70984b265c9bf72aa781981ebd3894233e4c44e806303e4bdf85" gracePeriod=30 Apr 25 00:55:56.136068 
ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:56.136027 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" podUID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.63:8643/healthz\": dial tcp 10.134.0.63:8643: connect: connection refused" Apr 25 00:55:56.141815 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:56.141782 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" podUID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 25 00:55:56.348206 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:56.348119 2576 generic.go:358] "Generic (PLEG): container finished" podID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" containerID="646e3efd80fe70984b265c9bf72aa781981ebd3894233e4c44e806303e4bdf85" exitCode=2 Apr 25 00:55:56.348206 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:56.348167 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" event={"ID":"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e","Type":"ContainerDied","Data":"646e3efd80fe70984b265c9bf72aa781981ebd3894233e4c44e806303e4bdf85"} Apr 25 00:55:57.124545 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.124507 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6"] Apr 25 00:55:57.124909 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.124824 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33472dc9-805c-42f8-ae87-105a010b793a" containerName="kserve-container" Apr 25 00:55:57.124909 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.124837 2576 
state_mem.go:107] "Deleted CPUSet assignment" podUID="33472dc9-805c-42f8-ae87-105a010b793a" containerName="kserve-container" Apr 25 00:55:57.124909 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.124853 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33472dc9-805c-42f8-ae87-105a010b793a" containerName="storage-initializer" Apr 25 00:55:57.124909 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.124858 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="33472dc9-805c-42f8-ae87-105a010b793a" containerName="storage-initializer" Apr 25 00:55:57.124909 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.124866 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33472dc9-805c-42f8-ae87-105a010b793a" containerName="kube-rbac-proxy" Apr 25 00:55:57.124909 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.124872 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="33472dc9-805c-42f8-ae87-105a010b793a" containerName="kube-rbac-proxy" Apr 25 00:55:57.124909 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.124935 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="33472dc9-805c-42f8-ae87-105a010b793a" containerName="kserve-container" Apr 25 00:55:57.125178 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.124945 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="33472dc9-805c-42f8-ae87-105a010b793a" containerName="kube-rbac-proxy" Apr 25 00:55:57.127737 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.127721 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6" Apr 25 00:55:57.130040 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.130021 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-predictor-serving-cert\"" Apr 25 00:55:57.130040 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.130033 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\"" Apr 25 00:55:57.138327 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.138304 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6"] Apr 25 00:55:57.285349 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.285304 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6\" (UID: \"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6" Apr 25 00:55:57.285534 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.285376 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6\" (UID: \"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6" Apr 25 00:55:57.285534 ip-10-0-129-109 kubenswrapper[2576]: I0425 
00:55:57.285398 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zdtq\" (UniqueName: \"kubernetes.io/projected/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-kube-api-access-4zdtq\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6\" (UID: \"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6" Apr 25 00:55:57.285534 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.285417 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6\" (UID: \"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6" Apr 25 00:55:57.385854 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.385767 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6\" (UID: \"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6" Apr 25 00:55:57.385854 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.385800 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zdtq\" (UniqueName: \"kubernetes.io/projected/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-kube-api-access-4zdtq\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6\" (UID: \"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6" Apr 25 00:55:57.385854 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.385823 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6\" (UID: \"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6" Apr 25 00:55:57.386168 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.385885 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6\" (UID: \"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6" Apr 25 00:55:57.386264 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.386238 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6\" (UID: \"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6" Apr 25 00:55:57.386491 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.386472 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6\" (UID: \"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6" Apr 
25 00:55:57.388600 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.388579 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6\" (UID: \"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6" Apr 25 00:55:57.394585 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.394559 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zdtq\" (UniqueName: \"kubernetes.io/projected/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-kube-api-access-4zdtq\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6\" (UID: \"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6" Apr 25 00:55:57.438108 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.438088 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6" Apr 25 00:55:57.618249 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:57.618214 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6"] Apr 25 00:55:57.621484 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:55:57.621455 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3c1ee2a_1919_4f24_ab75_8490c5dc6ef1.slice/crio-b039e836b029b53183c587ac1fb90a8497c9c11a370aa29b159a721ca871c9be WatchSource:0}: Error finding container b039e836b029b53183c587ac1fb90a8497c9c11a370aa29b159a721ca871c9be: Status 404 returned error can't find the container with id b039e836b029b53183c587ac1fb90a8497c9c11a370aa29b159a721ca871c9be Apr 25 00:55:58.354367 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:58.354332 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6" event={"ID":"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1","Type":"ContainerStarted","Data":"66829d3926140842221de120c887abe83578a7ec715e987cb3698c7a0ba94d88"} Apr 25 00:55:58.354367 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:55:58.354368 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6" event={"ID":"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1","Type":"ContainerStarted","Data":"b039e836b029b53183c587ac1fb90a8497c9c11a370aa29b159a721ca871c9be"} Apr 25 00:56:00.179280 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.179257 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:56:00.309615 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.309579 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-proxy-tls\") pod \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\" (UID: \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\") " Apr 25 00:56:00.309784 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.309638 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-cabundle-cert\") pod \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\" (UID: \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\") " Apr 25 00:56:00.309784 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.309663 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr5tx\" (UniqueName: \"kubernetes.io/projected/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-kube-api-access-qr5tx\") pod \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\" (UID: \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\") " Apr 25 00:56:00.309784 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.309708 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-kserve-provision-location\") pod \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\" (UID: \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\") " Apr 25 00:56:00.309784 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.309727 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod 
\"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\" (UID: \"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e\") " Apr 25 00:56:00.310139 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.310106 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" (UID: "7a8ed903-d44c-4e32-aff3-f7fc0ff0512e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:56:00.310139 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.310124 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config") pod "7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" (UID: "7a8ed903-d44c-4e32-aff3-f7fc0ff0512e"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:56:00.310139 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.310133 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" (UID: "7a8ed903-d44c-4e32-aff3-f7fc0ff0512e"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:56:00.311962 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.311942 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" (UID: "7a8ed903-d44c-4e32-aff3-f7fc0ff0512e"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:56:00.312137 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.312117 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-kube-api-access-qr5tx" (OuterVolumeSpecName: "kube-api-access-qr5tx") pod "7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" (UID: "7a8ed903-d44c-4e32-aff3-f7fc0ff0512e"). InnerVolumeSpecName "kube-api-access-qr5tx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:56:00.361148 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.361121 2576 generic.go:358] "Generic (PLEG): container finished" podID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" containerID="3a24e146a678d5d5aeae8a5ac530528db80785d97f9cacc0dcec59f1fa89402d" exitCode=0 Apr 25 00:56:00.361248 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.361156 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" event={"ID":"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e","Type":"ContainerDied","Data":"3a24e146a678d5d5aeae8a5ac530528db80785d97f9cacc0dcec59f1fa89402d"} Apr 25 00:56:00.361248 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.361182 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" event={"ID":"7a8ed903-d44c-4e32-aff3-f7fc0ff0512e","Type":"ContainerDied","Data":"2a1cf38b0a01b1b3a52f41e1d7e6168612893261e46b7f8ffb3fc5ec5cf61f32"} Apr 25 00:56:00.361248 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.361191 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl" Apr 25 00:56:00.361248 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.361200 2576 scope.go:117] "RemoveContainer" containerID="646e3efd80fe70984b265c9bf72aa781981ebd3894233e4c44e806303e4bdf85" Apr 25 00:56:00.368988 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.368970 2576 scope.go:117] "RemoveContainer" containerID="3a24e146a678d5d5aeae8a5ac530528db80785d97f9cacc0dcec59f1fa89402d" Apr 25 00:56:00.375613 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.375597 2576 scope.go:117] "RemoveContainer" containerID="25331556e5d23ca5468395a979bb2b0ae5108e66c2245a19efe11addd81afd86" Apr 25 00:56:00.379904 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.379881 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl"] Apr 25 00:56:00.383280 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.383264 2576 scope.go:117] "RemoveContainer" containerID="646e3efd80fe70984b265c9bf72aa781981ebd3894233e4c44e806303e4bdf85" Apr 25 00:56:00.383562 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:56:00.383537 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"646e3efd80fe70984b265c9bf72aa781981ebd3894233e4c44e806303e4bdf85\": container with ID starting with 646e3efd80fe70984b265c9bf72aa781981ebd3894233e4c44e806303e4bdf85 not found: ID does not exist" containerID="646e3efd80fe70984b265c9bf72aa781981ebd3894233e4c44e806303e4bdf85" Apr 25 00:56:00.383634 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.383575 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"646e3efd80fe70984b265c9bf72aa781981ebd3894233e4c44e806303e4bdf85"} err="failed to get container status \"646e3efd80fe70984b265c9bf72aa781981ebd3894233e4c44e806303e4bdf85\": rpc error: code = NotFound desc = 
could not find container \"646e3efd80fe70984b265c9bf72aa781981ebd3894233e4c44e806303e4bdf85\": container with ID starting with 646e3efd80fe70984b265c9bf72aa781981ebd3894233e4c44e806303e4bdf85 not found: ID does not exist" Apr 25 00:56:00.383634 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.383601 2576 scope.go:117] "RemoveContainer" containerID="3a24e146a678d5d5aeae8a5ac530528db80785d97f9cacc0dcec59f1fa89402d" Apr 25 00:56:00.383875 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:56:00.383849 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a24e146a678d5d5aeae8a5ac530528db80785d97f9cacc0dcec59f1fa89402d\": container with ID starting with 3a24e146a678d5d5aeae8a5ac530528db80785d97f9cacc0dcec59f1fa89402d not found: ID does not exist" containerID="3a24e146a678d5d5aeae8a5ac530528db80785d97f9cacc0dcec59f1fa89402d" Apr 25 00:56:00.383954 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.383886 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a24e146a678d5d5aeae8a5ac530528db80785d97f9cacc0dcec59f1fa89402d"} err="failed to get container status \"3a24e146a678d5d5aeae8a5ac530528db80785d97f9cacc0dcec59f1fa89402d\": rpc error: code = NotFound desc = could not find container \"3a24e146a678d5d5aeae8a5ac530528db80785d97f9cacc0dcec59f1fa89402d\": container with ID starting with 3a24e146a678d5d5aeae8a5ac530528db80785d97f9cacc0dcec59f1fa89402d not found: ID does not exist" Apr 25 00:56:00.383954 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.383910 2576 scope.go:117] "RemoveContainer" containerID="25331556e5d23ca5468395a979bb2b0ae5108e66c2245a19efe11addd81afd86" Apr 25 00:56:00.384067 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.383964 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-847874886b-rvhkl"] Apr 25 00:56:00.384178 ip-10-0-129-109 kubenswrapper[2576]: 
E0425 00:56:00.384161 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25331556e5d23ca5468395a979bb2b0ae5108e66c2245a19efe11addd81afd86\": container with ID starting with 25331556e5d23ca5468395a979bb2b0ae5108e66c2245a19efe11addd81afd86 not found: ID does not exist" containerID="25331556e5d23ca5468395a979bb2b0ae5108e66c2245a19efe11addd81afd86"
Apr 25 00:56:00.384218 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.384183 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25331556e5d23ca5468395a979bb2b0ae5108e66c2245a19efe11addd81afd86"} err="failed to get container status \"25331556e5d23ca5468395a979bb2b0ae5108e66c2245a19efe11addd81afd86\": rpc error: code = NotFound desc = could not find container \"25331556e5d23ca5468395a979bb2b0ae5108e66c2245a19efe11addd81afd86\": container with ID starting with 25331556e5d23ca5468395a979bb2b0ae5108e66c2245a19efe11addd81afd86 not found: ID does not exist"
Apr 25 00:56:00.410698 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.410676 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:56:00.410764 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.410698 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:56:00.410764 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.410708 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:56:00.410764 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.410718 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-cabundle-cert\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:56:00.410764 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:00.410726 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qr5tx\" (UniqueName: \"kubernetes.io/projected/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e-kube-api-access-qr5tx\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:56:01.365819 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:01.365784 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6_c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1/storage-initializer/0.log"
Apr 25 00:56:01.365819 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:01.365821 2576 generic.go:358] "Generic (PLEG): container finished" podID="c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1" containerID="66829d3926140842221de120c887abe83578a7ec715e987cb3698c7a0ba94d88" exitCode=1
Apr 25 00:56:01.366319 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:01.365870 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6" event={"ID":"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1","Type":"ContainerDied","Data":"66829d3926140842221de120c887abe83578a7ec715e987cb3698c7a0ba94d88"}
Apr 25 00:56:02.315073 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:02.315041 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" path="/var/lib/kubelet/pods/7a8ed903-d44c-4e32-aff3-f7fc0ff0512e/volumes"
Apr 25 00:56:02.370285 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:02.370260 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6_c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1/storage-initializer/0.log"
Apr 25 00:56:02.370611 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:02.370340 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6" event={"ID":"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1","Type":"ContainerStarted","Data":"842b7012c6cb3352c63798b47990a87f94876cbbcaec8d6be358b58c6bdd3c6c"}
Apr 25 00:56:07.096692 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:07.096659 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6"]
Apr 25 00:56:07.097221 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:07.096967 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6" podUID="c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1" containerName="storage-initializer" containerID="cri-o://842b7012c6cb3352c63798b47990a87f94876cbbcaec8d6be358b58c6bdd3c6c" gracePeriod=30
Apr 25 00:56:07.386542 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:07.386518 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6_c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1/storage-initializer/1.log"
Apr 25 00:56:07.386983 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:07.386962 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6_c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1/storage-initializer/0.log"
Apr 25 00:56:07.387111 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:07.386997 2576 generic.go:358] "Generic (PLEG): container finished" podID="c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1" containerID="842b7012c6cb3352c63798b47990a87f94876cbbcaec8d6be358b58c6bdd3c6c" exitCode=1
Apr 25 00:56:07.387111 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:07.387049 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6" event={"ID":"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1","Type":"ContainerDied","Data":"842b7012c6cb3352c63798b47990a87f94876cbbcaec8d6be358b58c6bdd3c6c"}
Apr 25 00:56:07.387111 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:07.387077 2576 scope.go:117] "RemoveContainer" containerID="66829d3926140842221de120c887abe83578a7ec715e987cb3698c7a0ba94d88"
Apr 25 00:56:07.437467 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:07.437443 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6_c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1/storage-initializer/1.log"
Apr 25 00:56:07.437570 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:07.437520 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6"
Apr 25 00:56:07.559175 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:07.559142 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-kserve-provision-location\") pod \"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1\" (UID: \"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1\") "
Apr 25 00:56:07.559343 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:07.559184 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1\" (UID: \"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1\") "
Apr 25 00:56:07.559343 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:07.559212 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zdtq\" (UniqueName: \"kubernetes.io/projected/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-kube-api-access-4zdtq\") pod \"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1\" (UID: \"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1\") "
Apr 25 00:56:07.559343 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:07.559241 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-proxy-tls\") pod \"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1\" (UID: \"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1\") "
Apr 25 00:56:07.559523 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:07.559477 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1" (UID: "c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:56:07.559604 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:07.559577 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config") pod "c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1" (UID: "c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 25 00:56:07.561508 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:07.561486 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1" (UID: "c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 25 00:56:07.561591 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:07.561529 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-kube-api-access-4zdtq" (OuterVolumeSpecName: "kube-api-access-4zdtq") pod "c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1" (UID: "c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1"). InnerVolumeSpecName "kube-api-access-4zdtq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:56:07.659950 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:07.659858 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:56:07.659950 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:07.659886 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:56:07.659950 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:07.659897 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4zdtq\" (UniqueName: \"kubernetes.io/projected/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-kube-api-access-4zdtq\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:56:07.659950 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:07.659906 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\""
Apr 25 00:56:08.195593 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.195551 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"]
Apr 25 00:56:08.196001 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.195857 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" containerName="kube-rbac-proxy"
Apr 25 00:56:08.196001 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.195875 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" containerName="kube-rbac-proxy"
Apr 25 00:56:08.196001 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.195890 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1" containerName="storage-initializer"
Apr 25 00:56:08.196001 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.195898 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1" containerName="storage-initializer"
Apr 25 00:56:08.196001 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.195909 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" containerName="storage-initializer"
Apr 25 00:56:08.196001 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.195945 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" containerName="storage-initializer"
Apr 25 00:56:08.196001 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.195966 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" containerName="kserve-container"
Apr 25 00:56:08.196001 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.195975 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" containerName="kserve-container"
Apr 25 00:56:08.196001 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.195986 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1" containerName="storage-initializer"
Apr 25 00:56:08.196001 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.195995 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1" containerName="storage-initializer"
Apr 25 00:56:08.196360 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.196085 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1" containerName="storage-initializer"
Apr 25 00:56:08.196360 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.196099 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1" containerName="storage-initializer"
Apr 25 00:56:08.196360 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.196111 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" containerName="kserve-container"
Apr 25 00:56:08.196360 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.196121 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a8ed903-d44c-4e32-aff3-f7fc0ff0512e" containerName="kube-rbac-proxy"
Apr 25 00:56:08.200710 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.200690 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"
Apr 25 00:56:08.203153 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.203128 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert\""
Apr 25 00:56:08.203301 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.203168 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 25 00:56:08.203301 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.203208 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\""
Apr 25 00:56:08.208755 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.208733 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"]
Apr 25 00:56:08.365374 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.365342 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v\" (UID: \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"
Apr 25 00:56:08.365374 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.365377 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfnnw\" (UniqueName: \"kubernetes.io/projected/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-kube-api-access-xfnnw\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v\" (UID: \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"
Apr 25 00:56:08.365568 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.365407 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v\" (UID: \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"
Apr 25 00:56:08.365568 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.365462 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v\" (UID: \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"
Apr 25 00:56:08.365568 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.365527 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v\" (UID: \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"
Apr 25 00:56:08.390738 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.390710 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6_c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1/storage-initializer/1.log"
Apr 25 00:56:08.390953 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.390809 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6" event={"ID":"c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1","Type":"ContainerDied","Data":"b039e836b029b53183c587ac1fb90a8497c9c11a370aa29b159a721ca871c9be"}
Apr 25 00:56:08.390953 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.390859 2576 scope.go:117] "RemoveContainer" containerID="842b7012c6cb3352c63798b47990a87f94876cbbcaec8d6be358b58c6bdd3c6c"
Apr 25 00:56:08.390953 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.390823 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6"
Apr 25 00:56:08.422956 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.422905 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6"]
Apr 25 00:56:08.427792 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.427769 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-74cbdcf44d-n8kt6"]
Apr 25 00:56:08.465960 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.465872 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v\" (UID: \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"
Apr 25 00:56:08.465960 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.465904 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfnnw\" (UniqueName: \"kubernetes.io/projected/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-kube-api-access-xfnnw\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v\" (UID: \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"
Apr 25 00:56:08.466119 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.466045 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v\" (UID: \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"
Apr 25 00:56:08.466119 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.466098 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v\" (UID: \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"
Apr 25 00:56:08.466201 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.466162 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v\" (UID: \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"
Apr 25 00:56:08.466319 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.466296 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v\" (UID: \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"
Apr 25 00:56:08.466673 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.466654 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v\" (UID: \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"
Apr 25 00:56:08.466739 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.466708 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v\" (UID: \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"
Apr 25 00:56:08.468704 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.468683 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v\" (UID: \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"
Apr 25 00:56:08.474714 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.474691 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfnnw\" (UniqueName: \"kubernetes.io/projected/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-kube-api-access-xfnnw\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v\" (UID: \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"
Apr 25 00:56:08.511994 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.511967 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"
Apr 25 00:56:08.637425 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:08.637395 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"]
Apr 25 00:56:08.639360 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:56:08.639333 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93df935b_4dd4_4369_a49c_bf7e95f8d7ed.slice/crio-a3f4f2a82469e3f1b221214f867141d5512494bebe0cc739f96cb470d62d62dd WatchSource:0}: Error finding container a3f4f2a82469e3f1b221214f867141d5512494bebe0cc739f96cb470d62d62dd: Status 404 returned error can't find the container with id a3f4f2a82469e3f1b221214f867141d5512494bebe0cc739f96cb470d62d62dd
Apr 25 00:56:09.395759 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:09.395724 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v" event={"ID":"93df935b-4dd4-4369-a49c-bf7e95f8d7ed","Type":"ContainerStarted","Data":"14e3260dc30bf678e78ae451203b5d0a439a905ad925d13d2374c3fdfe098477"}
Apr 25 00:56:09.395759 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:09.395766 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v" event={"ID":"93df935b-4dd4-4369-a49c-bf7e95f8d7ed","Type":"ContainerStarted","Data":"a3f4f2a82469e3f1b221214f867141d5512494bebe0cc739f96cb470d62d62dd"}
Apr 25 00:56:10.315830 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:10.315798 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1" path="/var/lib/kubelet/pods/c3c1ee2a-1919-4f24-ab75-8490c5dc6ef1/volumes"
Apr 25 00:56:10.399639 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:10.399601 2576 generic.go:358] "Generic (PLEG): container finished" podID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" containerID="14e3260dc30bf678e78ae451203b5d0a439a905ad925d13d2374c3fdfe098477" exitCode=0
Apr 25 00:56:10.400027 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:10.399687 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v" event={"ID":"93df935b-4dd4-4369-a49c-bf7e95f8d7ed","Type":"ContainerDied","Data":"14e3260dc30bf678e78ae451203b5d0a439a905ad925d13d2374c3fdfe098477"}
Apr 25 00:56:11.404089 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:11.404055 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v" event={"ID":"93df935b-4dd4-4369-a49c-bf7e95f8d7ed","Type":"ContainerStarted","Data":"a0333be83f6add50b85901f4a59beccafec90f1efb27c2e15229641a1517bef6"}
Apr 25 00:56:11.404089 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:11.404095 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v" event={"ID":"93df935b-4dd4-4369-a49c-bf7e95f8d7ed","Type":"ContainerStarted","Data":"8d2ac09e1d58437466495ea91ab72e007f13c1ef971feaa55dcbda0537410796"}
Apr 25 00:56:11.404495 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:11.404197 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"
Apr 25 00:56:11.421683 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:11.421640 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v" podStartSLOduration=3.421627956 podStartE2EDuration="3.421627956s" podCreationTimestamp="2026-04-25 00:56:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:56:11.420419788 +0000 UTC m=+3731.701536707" watchObservedRunningTime="2026-04-25 00:56:11.421627956 +0000 UTC m=+3731.702744872"
Apr 25 00:56:12.407310 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:12.407278 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"
Apr 25 00:56:12.408544 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:12.408515 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v" podUID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused"
Apr 25 00:56:13.409852 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:13.409810 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v" podUID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused"
Apr 25 00:56:18.414189 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:18.414161 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"
Apr 25 00:56:18.414711 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:18.414688 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v" podUID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused"
Apr 25 00:56:28.415489 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:28.415446 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v" podUID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused"
Apr 25 00:56:38.415491 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:38.415451 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v" podUID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused"
Apr 25 00:56:48.415034 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:48.414989 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v" podUID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused"
Apr 25 00:56:58.415561 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:56:58.415522 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v" podUID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused"
Apr 25 00:57:08.414887 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:08.414845 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v" podUID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused"
Apr 25 00:57:18.415856 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:18.415824 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"
Apr 25 00:57:28.216278 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:28.216246 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"]
Apr 25 00:57:28.216680 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:28.216558 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v" podUID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" containerName="kserve-container" containerID="cri-o://8d2ac09e1d58437466495ea91ab72e007f13c1ef971feaa55dcbda0537410796" gracePeriod=30
Apr 25 00:57:28.216680 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:28.216601 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v" podUID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" containerName="kube-rbac-proxy" containerID="cri-o://a0333be83f6add50b85901f4a59beccafec90f1efb27c2e15229641a1517bef6" gracePeriod=30
Apr 25 00:57:28.410796 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:28.410755 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v" podUID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.65:8643/healthz\": dial tcp 10.134.0.65:8643: connect: connection refused"
Apr 25 00:57:28.415052 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:28.415026 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v" podUID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused"
Apr 25 00:57:28.627237 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:28.627205 2576 generic.go:358] "Generic (PLEG): container finished" podID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" containerID="a0333be83f6add50b85901f4a59beccafec90f1efb27c2e15229641a1517bef6" exitCode=2
Apr 25 00:57:28.627417 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:28.627264 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v" event={"ID":"93df935b-4dd4-4369-a49c-bf7e95f8d7ed","Type":"ContainerDied","Data":"a0333be83f6add50b85901f4a59beccafec90f1efb27c2e15229641a1517bef6"}
Apr 25 00:57:29.294906 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:29.294866 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6"]
Apr 25 00:57:29.298297 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:29.298278 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6"
Apr 25 00:57:29.300775 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:29.300753 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert\""
Apr 25 00:57:29.300883 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:29.300757 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\""
Apr 25 00:57:29.310552 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:29.310528 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6"]
Apr 25 00:57:29.447086 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:29.447052 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0896ccd-ea9e-4476-92c5-79097a4f7e60-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6\" (UID: \"e0896ccd-ea9e-4476-92c5-79097a4f7e60\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6"
Apr 25 00:57:29.447086 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:29.447090 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0896ccd-ea9e-4476-92c5-79097a4f7e60-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6\" (UID: \"e0896ccd-ea9e-4476-92c5-79097a4f7e60\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6"
Apr 25 00:57:29.447313 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:29.447171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7n9r\" (UniqueName: \"kubernetes.io/projected/e0896ccd-ea9e-4476-92c5-79097a4f7e60-kube-api-access-n7n9r\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6\" (UID: \"e0896ccd-ea9e-4476-92c5-79097a4f7e60\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6"
Apr 25 00:57:29.447313 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:29.447221 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e0896ccd-ea9e-4476-92c5-79097a4f7e60-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6\" (UID: \"e0896ccd-ea9e-4476-92c5-79097a4f7e60\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6"
Apr 25 00:57:29.548480 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:29.548368 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0896ccd-ea9e-4476-92c5-79097a4f7e60-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6\" (UID: \"e0896ccd-ea9e-4476-92c5-79097a4f7e60\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6" Apr 25 00:57:29.548480 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:29.548423 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0896ccd-ea9e-4476-92c5-79097a4f7e60-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6\" (UID: \"e0896ccd-ea9e-4476-92c5-79097a4f7e60\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6" Apr 25 00:57:29.548480 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:29.548458 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7n9r\" (UniqueName: \"kubernetes.io/projected/e0896ccd-ea9e-4476-92c5-79097a4f7e60-kube-api-access-n7n9r\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6\" (UID: \"e0896ccd-ea9e-4476-92c5-79097a4f7e60\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6" Apr 25 00:57:29.548480 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:29.548479 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e0896ccd-ea9e-4476-92c5-79097a4f7e60-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6\" (UID: \"e0896ccd-ea9e-4476-92c5-79097a4f7e60\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6" Apr 25 00:57:29.548858 ip-10-0-129-109 kubenswrapper[2576]: E0425 
00:57:29.548600 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert" not found Apr 25 00:57:29.548858 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:57:29.548678 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0896ccd-ea9e-4476-92c5-79097a4f7e60-proxy-tls podName:e0896ccd-ea9e-4476-92c5-79097a4f7e60 nodeName:}" failed. No retries permitted until 2026-04-25 00:57:30.048660105 +0000 UTC m=+3810.329777003 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e0896ccd-ea9e-4476-92c5-79097a4f7e60-proxy-tls") pod "isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6" (UID: "e0896ccd-ea9e-4476-92c5-79097a4f7e60") : secret "isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert" not found Apr 25 00:57:29.548858 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:29.548831 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0896ccd-ea9e-4476-92c5-79097a4f7e60-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6\" (UID: \"e0896ccd-ea9e-4476-92c5-79097a4f7e60\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6" Apr 25 00:57:29.549119 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:29.549100 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e0896ccd-ea9e-4476-92c5-79097a4f7e60-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6\" (UID: \"e0896ccd-ea9e-4476-92c5-79097a4f7e60\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6" Apr 25 00:57:29.557399 
ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:29.557367 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7n9r\" (UniqueName: \"kubernetes.io/projected/e0896ccd-ea9e-4476-92c5-79097a4f7e60-kube-api-access-n7n9r\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6\" (UID: \"e0896ccd-ea9e-4476-92c5-79097a4f7e60\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6" Apr 25 00:57:30.054063 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:30.054013 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0896ccd-ea9e-4476-92c5-79097a4f7e60-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6\" (UID: \"e0896ccd-ea9e-4476-92c5-79097a4f7e60\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6" Apr 25 00:57:30.056551 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:30.056527 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0896ccd-ea9e-4476-92c5-79097a4f7e60-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6\" (UID: \"e0896ccd-ea9e-4476-92c5-79097a4f7e60\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6" Apr 25 00:57:30.208529 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:30.208470 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6" Apr 25 00:57:30.327136 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:30.327064 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6"] Apr 25 00:57:30.330360 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:57:30.330331 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0896ccd_ea9e_4476_92c5_79097a4f7e60.slice/crio-3a54e767976c8d50291f7362a10d136229bb270c18e28312fdead186ae46dd43 WatchSource:0}: Error finding container 3a54e767976c8d50291f7362a10d136229bb270c18e28312fdead186ae46dd43: Status 404 returned error can't find the container with id 3a54e767976c8d50291f7362a10d136229bb270c18e28312fdead186ae46dd43 Apr 25 00:57:30.634791 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:30.634710 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6" event={"ID":"e0896ccd-ea9e-4476-92c5-79097a4f7e60","Type":"ContainerStarted","Data":"c6d278cc5d17b0dab3b3c6d295f9531a65544535dc1f71255fa668eff9368c36"} Apr 25 00:57:30.634791 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:30.634751 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6" event={"ID":"e0896ccd-ea9e-4476-92c5-79097a4f7e60","Type":"ContainerStarted","Data":"3a54e767976c8d50291f7362a10d136229bb270c18e28312fdead186ae46dd43"} Apr 25 00:57:32.356119 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.356096 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v" Apr 25 00:57:32.474381 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.474296 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfnnw\" (UniqueName: \"kubernetes.io/projected/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-kube-api-access-xfnnw\") pod \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\" (UID: \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\") " Apr 25 00:57:32.474381 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.474353 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\" (UID: \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\") " Apr 25 00:57:32.474596 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.474395 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-kserve-provision-location\") pod \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\" (UID: \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\") " Apr 25 00:57:32.474596 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.474435 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-proxy-tls\") pod \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\" (UID: \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\") " Apr 25 00:57:32.474596 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.474465 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-cabundle-cert\") pod 
\"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\" (UID: \"93df935b-4dd4-4369-a49c-bf7e95f8d7ed\") " Apr 25 00:57:32.474851 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.474810 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "93df935b-4dd4-4369-a49c-bf7e95f8d7ed" (UID: "93df935b-4dd4-4369-a49c-bf7e95f8d7ed"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:57:32.474851 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.474824 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config") pod "93df935b-4dd4-4369-a49c-bf7e95f8d7ed" (UID: "93df935b-4dd4-4369-a49c-bf7e95f8d7ed"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:57:32.474851 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.474839 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "93df935b-4dd4-4369-a49c-bf7e95f8d7ed" (UID: "93df935b-4dd4-4369-a49c-bf7e95f8d7ed"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:57:32.476710 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.476688 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "93df935b-4dd4-4369-a49c-bf7e95f8d7ed" (UID: "93df935b-4dd4-4369-a49c-bf7e95f8d7ed"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:57:32.476710 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.476702 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-kube-api-access-xfnnw" (OuterVolumeSpecName: "kube-api-access-xfnnw") pod "93df935b-4dd4-4369-a49c-bf7e95f8d7ed" (UID: "93df935b-4dd4-4369-a49c-bf7e95f8d7ed"). InnerVolumeSpecName "kube-api-access-xfnnw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:57:32.575268 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.575218 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:57:32.575268 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.575263 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:57:32.575268 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.575274 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:57:32.575268 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.575284 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-cabundle-cert\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:57:32.575543 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.575294 2576 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xfnnw\" (UniqueName: \"kubernetes.io/projected/93df935b-4dd4-4369-a49c-bf7e95f8d7ed-kube-api-access-xfnnw\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:57:32.642224 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.642189 2576 generic.go:358] "Generic (PLEG): container finished" podID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" containerID="8d2ac09e1d58437466495ea91ab72e007f13c1ef971feaa55dcbda0537410796" exitCode=0 Apr 25 00:57:32.642348 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.642234 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v" event={"ID":"93df935b-4dd4-4369-a49c-bf7e95f8d7ed","Type":"ContainerDied","Data":"8d2ac09e1d58437466495ea91ab72e007f13c1ef971feaa55dcbda0537410796"} Apr 25 00:57:32.642348 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.642261 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v" event={"ID":"93df935b-4dd4-4369-a49c-bf7e95f8d7ed","Type":"ContainerDied","Data":"a3f4f2a82469e3f1b221214f867141d5512494bebe0cc739f96cb470d62d62dd"} Apr 25 00:57:32.642348 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.642270 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v" Apr 25 00:57:32.642469 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.642275 2576 scope.go:117] "RemoveContainer" containerID="a0333be83f6add50b85901f4a59beccafec90f1efb27c2e15229641a1517bef6" Apr 25 00:57:32.650786 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.650765 2576 scope.go:117] "RemoveContainer" containerID="8d2ac09e1d58437466495ea91ab72e007f13c1ef971feaa55dcbda0537410796" Apr 25 00:57:32.657810 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.657793 2576 scope.go:117] "RemoveContainer" containerID="14e3260dc30bf678e78ae451203b5d0a439a905ad925d13d2374c3fdfe098477" Apr 25 00:57:32.664318 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.664293 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"] Apr 25 00:57:32.666151 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.666101 2576 scope.go:117] "RemoveContainer" containerID="a0333be83f6add50b85901f4a59beccafec90f1efb27c2e15229641a1517bef6" Apr 25 00:57:32.666419 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:57:32.666400 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0333be83f6add50b85901f4a59beccafec90f1efb27c2e15229641a1517bef6\": container with ID starting with a0333be83f6add50b85901f4a59beccafec90f1efb27c2e15229641a1517bef6 not found: ID does not exist" containerID="a0333be83f6add50b85901f4a59beccafec90f1efb27c2e15229641a1517bef6" Apr 25 00:57:32.666488 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.666429 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0333be83f6add50b85901f4a59beccafec90f1efb27c2e15229641a1517bef6"} err="failed to get container status \"a0333be83f6add50b85901f4a59beccafec90f1efb27c2e15229641a1517bef6\": rpc error: code = NotFound desc = 
could not find container \"a0333be83f6add50b85901f4a59beccafec90f1efb27c2e15229641a1517bef6\": container with ID starting with a0333be83f6add50b85901f4a59beccafec90f1efb27c2e15229641a1517bef6 not found: ID does not exist" Apr 25 00:57:32.666488 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.666449 2576 scope.go:117] "RemoveContainer" containerID="8d2ac09e1d58437466495ea91ab72e007f13c1ef971feaa55dcbda0537410796" Apr 25 00:57:32.666728 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:57:32.666714 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d2ac09e1d58437466495ea91ab72e007f13c1ef971feaa55dcbda0537410796\": container with ID starting with 8d2ac09e1d58437466495ea91ab72e007f13c1ef971feaa55dcbda0537410796 not found: ID does not exist" containerID="8d2ac09e1d58437466495ea91ab72e007f13c1ef971feaa55dcbda0537410796" Apr 25 00:57:32.666784 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.666732 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d2ac09e1d58437466495ea91ab72e007f13c1ef971feaa55dcbda0537410796"} err="failed to get container status \"8d2ac09e1d58437466495ea91ab72e007f13c1ef971feaa55dcbda0537410796\": rpc error: code = NotFound desc = could not find container \"8d2ac09e1d58437466495ea91ab72e007f13c1ef971feaa55dcbda0537410796\": container with ID starting with 8d2ac09e1d58437466495ea91ab72e007f13c1ef971feaa55dcbda0537410796 not found: ID does not exist" Apr 25 00:57:32.666784 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.666746 2576 scope.go:117] "RemoveContainer" containerID="14e3260dc30bf678e78ae451203b5d0a439a905ad925d13d2374c3fdfe098477" Apr 25 00:57:32.667114 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:57:32.667085 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14e3260dc30bf678e78ae451203b5d0a439a905ad925d13d2374c3fdfe098477\": 
container with ID starting with 14e3260dc30bf678e78ae451203b5d0a439a905ad925d13d2374c3fdfe098477 not found: ID does not exist" containerID="14e3260dc30bf678e78ae451203b5d0a439a905ad925d13d2374c3fdfe098477" Apr 25 00:57:32.667190 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.667120 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14e3260dc30bf678e78ae451203b5d0a439a905ad925d13d2374c3fdfe098477"} err="failed to get container status \"14e3260dc30bf678e78ae451203b5d0a439a905ad925d13d2374c3fdfe098477\": rpc error: code = NotFound desc = could not find container \"14e3260dc30bf678e78ae451203b5d0a439a905ad925d13d2374c3fdfe098477\": container with ID starting with 14e3260dc30bf678e78ae451203b5d0a439a905ad925d13d2374c3fdfe098477 not found: ID does not exist" Apr 25 00:57:32.667487 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:32.667470 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-79c8d8749d-52m5v"] Apr 25 00:57:34.315561 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:34.315523 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" path="/var/lib/kubelet/pods/93df935b-4dd4-4369-a49c-bf7e95f8d7ed/volumes" Apr 25 00:57:34.649376 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:34.649347 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6_e0896ccd-ea9e-4476-92c5-79097a4f7e60/storage-initializer/0.log" Apr 25 00:57:34.649542 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:34.649389 2576 generic.go:358] "Generic (PLEG): container finished" podID="e0896ccd-ea9e-4476-92c5-79097a4f7e60" containerID="c6d278cc5d17b0dab3b3c6d295f9531a65544535dc1f71255fa668eff9368c36" exitCode=1 Apr 25 00:57:34.649542 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:34.649460 2576 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6" event={"ID":"e0896ccd-ea9e-4476-92c5-79097a4f7e60","Type":"ContainerDied","Data":"c6d278cc5d17b0dab3b3c6d295f9531a65544535dc1f71255fa668eff9368c36"} Apr 25 00:57:35.656408 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:35.656360 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6_e0896ccd-ea9e-4476-92c5-79097a4f7e60/storage-initializer/0.log" Apr 25 00:57:35.656799 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:35.656499 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6" event={"ID":"e0896ccd-ea9e-4476-92c5-79097a4f7e60","Type":"ContainerStarted","Data":"428195014eb4678893aef9b5f43af15c702084118c801940106e73a93834b8c5"} Apr 25 00:57:39.277697 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.277667 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6"] Apr 25 00:57:39.278117 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.277952 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6" podUID="e0896ccd-ea9e-4476-92c5-79097a4f7e60" containerName="storage-initializer" containerID="cri-o://428195014eb4678893aef9b5f43af15c702084118c801940106e73a93834b8c5" gracePeriod=30 Apr 25 00:57:39.407618 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.407596 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6_e0896ccd-ea9e-4476-92c5-79097a4f7e60/storage-initializer/1.log" Apr 25 00:57:39.407970 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.407954 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6_e0896ccd-ea9e-4476-92c5-79097a4f7e60/storage-initializer/0.log" Apr 25 00:57:39.408085 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.408030 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6" Apr 25 00:57:39.531392 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.531293 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0896ccd-ea9e-4476-92c5-79097a4f7e60-kserve-provision-location\") pod \"e0896ccd-ea9e-4476-92c5-79097a4f7e60\" (UID: \"e0896ccd-ea9e-4476-92c5-79097a4f7e60\") " Apr 25 00:57:39.531392 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.531360 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e0896ccd-ea9e-4476-92c5-79097a4f7e60-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"e0896ccd-ea9e-4476-92c5-79097a4f7e60\" (UID: \"e0896ccd-ea9e-4476-92c5-79097a4f7e60\") " Apr 25 00:57:39.531392 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.531391 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0896ccd-ea9e-4476-92c5-79097a4f7e60-proxy-tls\") pod \"e0896ccd-ea9e-4476-92c5-79097a4f7e60\" (UID: \"e0896ccd-ea9e-4476-92c5-79097a4f7e60\") " Apr 25 00:57:39.531670 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.531421 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7n9r\" (UniqueName: \"kubernetes.io/projected/e0896ccd-ea9e-4476-92c5-79097a4f7e60-kube-api-access-n7n9r\") pod \"e0896ccd-ea9e-4476-92c5-79097a4f7e60\" (UID: 
\"e0896ccd-ea9e-4476-92c5-79097a4f7e60\") " Apr 25 00:57:39.531670 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.531548 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0896ccd-ea9e-4476-92c5-79097a4f7e60-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e0896ccd-ea9e-4476-92c5-79097a4f7e60" (UID: "e0896ccd-ea9e-4476-92c5-79097a4f7e60"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:57:39.531670 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.531637 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0896ccd-ea9e-4476-92c5-79097a4f7e60-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:57:39.531806 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.531779 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0896ccd-ea9e-4476-92c5-79097a4f7e60-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config") pod "e0896ccd-ea9e-4476-92c5-79097a4f7e60" (UID: "e0896ccd-ea9e-4476-92c5-79097a4f7e60"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:57:39.533635 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.533610 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0896ccd-ea9e-4476-92c5-79097a4f7e60-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e0896ccd-ea9e-4476-92c5-79097a4f7e60" (UID: "e0896ccd-ea9e-4476-92c5-79097a4f7e60"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:57:39.533742 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.533672 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0896ccd-ea9e-4476-92c5-79097a4f7e60-kube-api-access-n7n9r" (OuterVolumeSpecName: "kube-api-access-n7n9r") pod "e0896ccd-ea9e-4476-92c5-79097a4f7e60" (UID: "e0896ccd-ea9e-4476-92c5-79097a4f7e60"). InnerVolumeSpecName "kube-api-access-n7n9r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:57:39.632538 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.632500 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e0896ccd-ea9e-4476-92c5-79097a4f7e60-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:57:39.632538 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.632530 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e0896ccd-ea9e-4476-92c5-79097a4f7e60-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:57:39.632538 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.632542 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n7n9r\" (UniqueName: \"kubernetes.io/projected/e0896ccd-ea9e-4476-92c5-79097a4f7e60-kube-api-access-n7n9r\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:57:39.668847 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.668815 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6_e0896ccd-ea9e-4476-92c5-79097a4f7e60/storage-initializer/1.log" Apr 25 00:57:39.669198 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.669183 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6_e0896ccd-ea9e-4476-92c5-79097a4f7e60/storage-initializer/0.log" Apr 25 00:57:39.669250 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.669221 2576 generic.go:358] "Generic (PLEG): container finished" podID="e0896ccd-ea9e-4476-92c5-79097a4f7e60" containerID="428195014eb4678893aef9b5f43af15c702084118c801940106e73a93834b8c5" exitCode=1 Apr 25 00:57:39.669313 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.669297 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6" Apr 25 00:57:39.669351 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.669312 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6" event={"ID":"e0896ccd-ea9e-4476-92c5-79097a4f7e60","Type":"ContainerDied","Data":"428195014eb4678893aef9b5f43af15c702084118c801940106e73a93834b8c5"} Apr 25 00:57:39.669385 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.669350 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6" event={"ID":"e0896ccd-ea9e-4476-92c5-79097a4f7e60","Type":"ContainerDied","Data":"3a54e767976c8d50291f7362a10d136229bb270c18e28312fdead186ae46dd43"} Apr 25 00:57:39.669385 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.669365 2576 scope.go:117] "RemoveContainer" containerID="428195014eb4678893aef9b5f43af15c702084118c801940106e73a93834b8c5" Apr 25 00:57:39.677249 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.677232 2576 scope.go:117] "RemoveContainer" containerID="c6d278cc5d17b0dab3b3c6d295f9531a65544535dc1f71255fa668eff9368c36" Apr 25 00:57:39.684242 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.684223 2576 scope.go:117] "RemoveContainer" 
containerID="428195014eb4678893aef9b5f43af15c702084118c801940106e73a93834b8c5"
Apr 25 00:57:39.684482 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:57:39.684457 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"428195014eb4678893aef9b5f43af15c702084118c801940106e73a93834b8c5\": container with ID starting with 428195014eb4678893aef9b5f43af15c702084118c801940106e73a93834b8c5 not found: ID does not exist" containerID="428195014eb4678893aef9b5f43af15c702084118c801940106e73a93834b8c5"
Apr 25 00:57:39.684563 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.684487 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"428195014eb4678893aef9b5f43af15c702084118c801940106e73a93834b8c5"} err="failed to get container status \"428195014eb4678893aef9b5f43af15c702084118c801940106e73a93834b8c5\": rpc error: code = NotFound desc = could not find container \"428195014eb4678893aef9b5f43af15c702084118c801940106e73a93834b8c5\": container with ID starting with 428195014eb4678893aef9b5f43af15c702084118c801940106e73a93834b8c5 not found: ID does not exist"
Apr 25 00:57:39.684563 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.684508 2576 scope.go:117] "RemoveContainer" containerID="c6d278cc5d17b0dab3b3c6d295f9531a65544535dc1f71255fa668eff9368c36"
Apr 25 00:57:39.684743 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:57:39.684724 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6d278cc5d17b0dab3b3c6d295f9531a65544535dc1f71255fa668eff9368c36\": container with ID starting with c6d278cc5d17b0dab3b3c6d295f9531a65544535dc1f71255fa668eff9368c36 not found: ID does not exist" containerID="c6d278cc5d17b0dab3b3c6d295f9531a65544535dc1f71255fa668eff9368c36"
Apr 25 00:57:39.684802 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.684751 2576 pod_container_deletor.go:53] "DeleteContainer returned
error" containerID={"Type":"cri-o","ID":"c6d278cc5d17b0dab3b3c6d295f9531a65544535dc1f71255fa668eff9368c36"} err="failed to get container status \"c6d278cc5d17b0dab3b3c6d295f9531a65544535dc1f71255fa668eff9368c36\": rpc error: code = NotFound desc = could not find container \"c6d278cc5d17b0dab3b3c6d295f9531a65544535dc1f71255fa668eff9368c36\": container with ID starting with c6d278cc5d17b0dab3b3c6d295f9531a65544535dc1f71255fa668eff9368c36 not found: ID does not exist"
Apr 25 00:57:39.703646 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.703619 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6"]
Apr 25 00:57:39.706661 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:39.706640 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-67d4ccc548-f99b6"]
Apr 25 00:57:40.315347 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.315311 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0896ccd-ea9e-4476-92c5-79097a4f7e60" path="/var/lib/kubelet/pods/e0896ccd-ea9e-4476-92c5-79097a4f7e60/volumes"
Apr 25 00:57:40.345596 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.345566 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"]
Apr 25 00:57:40.345816 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.345804 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0896ccd-ea9e-4476-92c5-79097a4f7e60" containerName="storage-initializer"
Apr 25 00:57:40.345872 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.345818 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0896ccd-ea9e-4476-92c5-79097a4f7e60" containerName="storage-initializer"
Apr 25 00:57:40.345872 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.345829 2576 cpu_manager.go:401] "RemoveStaleState: containerMap:
removing container" podUID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" containerName="storage-initializer"
Apr 25 00:57:40.345872 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.345835 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" containerName="storage-initializer"
Apr 25 00:57:40.345994 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.345877 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" containerName="kube-rbac-proxy"
Apr 25 00:57:40.345994 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.345882 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" containerName="kube-rbac-proxy"
Apr 25 00:57:40.345994 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.345891 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" containerName="kserve-container"
Apr 25 00:57:40.345994 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.345896 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" containerName="kserve-container"
Apr 25 00:57:40.345994 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.345953 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" containerName="kube-rbac-proxy"
Apr 25 00:57:40.345994 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.345963 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="93df935b-4dd4-4369-a49c-bf7e95f8d7ed" containerName="kserve-container"
Apr 25 00:57:40.345994 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.345970 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0896ccd-ea9e-4476-92c5-79097a4f7e60" containerName="storage-initializer"
Apr 25 00:57:40.345994 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.345976 2576 memory_manager.go:356]
"RemoveStaleState removing state" podUID="e0896ccd-ea9e-4476-92c5-79097a4f7e60" containerName="storage-initializer"
Apr 25 00:57:40.346316 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.346018 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0896ccd-ea9e-4476-92c5-79097a4f7e60" containerName="storage-initializer"
Apr 25 00:57:40.346316 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.346023 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0896ccd-ea9e-4476-92c5-79097a4f7e60" containerName="storage-initializer"
Apr 25 00:57:40.350252 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.350232 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"
Apr 25 00:57:40.353345 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.353319 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert\""
Apr 25 00:57:40.353465 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.353442 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 25 00:57:40.354166 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.354150 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 25 00:57:40.354166 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.354163 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-kz9zk\""
Apr 25 00:57:40.354351 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.354171 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 25 00:57:40.354351 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.354182
2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\""
Apr 25 00:57:40.354351 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.354265 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\""
Apr 25 00:57:40.357895 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.357876 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"]
Apr 25 00:57:40.538197 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.538162 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87hh2\" (UniqueName: \"kubernetes.io/projected/765f9e1b-e1d5-42da-9355-c7cc69f9131a-kube-api-access-87hh2\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2\" (UID: \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"
Apr 25 00:57:40.538385 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.538215 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/765f9e1b-e1d5-42da-9355-c7cc69f9131a-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2\" (UID: \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"
Apr 25 00:57:40.538385 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.538299 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/765f9e1b-e1d5-42da-9355-c7cc69f9131a-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod
\"isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2\" (UID: \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"
Apr 25 00:57:40.538385 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.538341 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/765f9e1b-e1d5-42da-9355-c7cc69f9131a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2\" (UID: \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"
Apr 25 00:57:40.538385 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.538374 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/765f9e1b-e1d5-42da-9355-c7cc69f9131a-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2\" (UID: \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"
Apr 25 00:57:40.639412 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.639327 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87hh2\" (UniqueName: \"kubernetes.io/projected/765f9e1b-e1d5-42da-9355-c7cc69f9131a-kube-api-access-87hh2\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2\" (UID: \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"
Apr 25 00:57:40.639412 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.639375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/765f9e1b-e1d5-42da-9355-c7cc69f9131a-cabundle-cert\") pod
\"isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2\" (UID: \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"
Apr 25 00:57:40.639587 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.639549 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/765f9e1b-e1d5-42da-9355-c7cc69f9131a-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2\" (UID: \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"
Apr 25 00:57:40.639646 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.639599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/765f9e1b-e1d5-42da-9355-c7cc69f9131a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2\" (UID: \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"
Apr 25 00:57:40.639646 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.639632 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/765f9e1b-e1d5-42da-9355-c7cc69f9131a-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2\" (UID: \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"
Apr 25 00:57:40.640007 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.639986 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName:
\"kubernetes.io/empty-dir/765f9e1b-e1d5-42da-9355-c7cc69f9131a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2\" (UID: \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"
Apr 25 00:57:40.640178 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.640156 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/765f9e1b-e1d5-42da-9355-c7cc69f9131a-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2\" (UID: \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"
Apr 25 00:57:40.640256 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.640234 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/765f9e1b-e1d5-42da-9355-c7cc69f9131a-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2\" (UID: \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"
Apr 25 00:57:40.642223 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.642207 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/765f9e1b-e1d5-42da-9355-c7cc69f9131a-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2\" (UID: \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"
Apr 25 00:57:40.647478 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.647460 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87hh2\" (UniqueName:
\"kubernetes.io/projected/765f9e1b-e1d5-42da-9355-c7cc69f9131a-kube-api-access-87hh2\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2\" (UID: \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"
Apr 25 00:57:40.661407 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.661388 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"
Apr 25 00:57:40.785781 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:40.785748 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"]
Apr 25 00:57:40.788974 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:57:40.788941 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod765f9e1b_e1d5_42da_9355_c7cc69f9131a.slice/crio-dbcf70063685f9d1e9c1c6bae04bebbfaa6f1ab948c083ed211de40f8bedc259 WatchSource:0}: Error finding container dbcf70063685f9d1e9c1c6bae04bebbfaa6f1ab948c083ed211de40f8bedc259: Status 404 returned error can't find the container with id dbcf70063685f9d1e9c1c6bae04bebbfaa6f1ab948c083ed211de40f8bedc259
Apr 25 00:57:41.677847 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:41.677805 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2" event={"ID":"765f9e1b-e1d5-42da-9355-c7cc69f9131a","Type":"ContainerStarted","Data":"a20e189985e54d86b485545ac95b8f316b93d675ed73f3f7edfd6e191a22d08d"}
Apr 25 00:57:41.677847 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:41.677846 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"
event={"ID":"765f9e1b-e1d5-42da-9355-c7cc69f9131a","Type":"ContainerStarted","Data":"dbcf70063685f9d1e9c1c6bae04bebbfaa6f1ab948c083ed211de40f8bedc259"}
Apr 25 00:57:42.682553 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:42.682516 2576 generic.go:358] "Generic (PLEG): container finished" podID="765f9e1b-e1d5-42da-9355-c7cc69f9131a" containerID="a20e189985e54d86b485545ac95b8f316b93d675ed73f3f7edfd6e191a22d08d" exitCode=0
Apr 25 00:57:42.682999 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:42.682592 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2" event={"ID":"765f9e1b-e1d5-42da-9355-c7cc69f9131a","Type":"ContainerDied","Data":"a20e189985e54d86b485545ac95b8f316b93d675ed73f3f7edfd6e191a22d08d"}
Apr 25 00:57:43.687530 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:43.687498 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2" event={"ID":"765f9e1b-e1d5-42da-9355-c7cc69f9131a","Type":"ContainerStarted","Data":"3386d410924268e2351af5fcb503e3cfeba2d1e232d4b86038ef9f41c554b2fd"}
Apr 25 00:57:43.687530 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:43.687533 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2" event={"ID":"765f9e1b-e1d5-42da-9355-c7cc69f9131a","Type":"ContainerStarted","Data":"ca694a020d1d6dbc20796f5a438882366506de52f52bdfbb75ca68f2b13da375"}
Apr 25 00:57:43.688020 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:43.687750 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"
Apr 25 00:57:43.688020 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:43.687847 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready"
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"
Apr 25 00:57:43.689179 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:43.689153 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2" podUID="765f9e1b-e1d5-42da-9355-c7cc69f9131a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused"
Apr 25 00:57:43.705036 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:43.704993 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2" podStartSLOduration=3.704981952 podStartE2EDuration="3.704981952s" podCreationTimestamp="2026-04-25 00:57:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:57:43.703379034 +0000 UTC m=+3823.984495954" watchObservedRunningTime="2026-04-25 00:57:43.704981952 +0000 UTC m=+3823.986098951"
Apr 25 00:57:44.691029 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:44.690993 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2" podUID="765f9e1b-e1d5-42da-9355-c7cc69f9131a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused"
Apr 25 00:57:49.695684 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:49.695653 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"
Apr 25 00:57:49.696329 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:49.696299 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2" podUID="765f9e1b-e1d5-42da-9355-c7cc69f9131a"
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused"
Apr 25 00:57:59.696220 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:57:59.696177 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2" podUID="765f9e1b-e1d5-42da-9355-c7cc69f9131a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused"
Apr 25 00:58:09.696698 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:09.696608 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2" podUID="765f9e1b-e1d5-42da-9355-c7cc69f9131a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused"
Apr 25 00:58:19.696426 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:19.696384 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2" podUID="765f9e1b-e1d5-42da-9355-c7cc69f9131a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused"
Apr 25 00:58:29.697230 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:29.697192 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2" podUID="765f9e1b-e1d5-42da-9355-c7cc69f9131a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused"
Apr 25 00:58:39.696652 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:39.696614 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2" podUID="765f9e1b-e1d5-42da-9355-c7cc69f9131a" containerName="kserve-container" probeResult="failure"
output="dial tcp 10.134.0.67:8080: connect: connection refused"
Apr 25 00:58:49.697097 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:49.697069 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"
Apr 25 00:58:50.374016 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:50.373978 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"]
Apr 25 00:58:50.374343 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:50.374309 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2" podUID="765f9e1b-e1d5-42da-9355-c7cc69f9131a" containerName="kserve-container" containerID="cri-o://ca694a020d1d6dbc20796f5a438882366506de52f52bdfbb75ca68f2b13da375" gracePeriod=30
Apr 25 00:58:50.374483 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:50.374315 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2" podUID="765f9e1b-e1d5-42da-9355-c7cc69f9131a" containerName="kube-rbac-proxy" containerID="cri-o://3386d410924268e2351af5fcb503e3cfeba2d1e232d4b86038ef9f41c554b2fd" gracePeriod=30
Apr 25 00:58:50.869442 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:50.869410 2576 generic.go:358] "Generic (PLEG): container finished" podID="765f9e1b-e1d5-42da-9355-c7cc69f9131a" containerID="3386d410924268e2351af5fcb503e3cfeba2d1e232d4b86038ef9f41c554b2fd" exitCode=2
Apr 25 00:58:50.869796 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:50.869447 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"
event={"ID":"765f9e1b-e1d5-42da-9355-c7cc69f9131a","Type":"ContainerDied","Data":"3386d410924268e2351af5fcb503e3cfeba2d1e232d4b86038ef9f41c554b2fd"}
Apr 25 00:58:51.447612 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:51.447570 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf"]
Apr 25 00:58:51.450757 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:51.450737 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf"
Apr 25 00:58:51.453122 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:51.453097 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert\""
Apr 25 00:58:51.453254 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:51.453101 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\""
Apr 25 00:58:51.463166 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:51.463125 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf"]
Apr 25 00:58:51.544713 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:51.544677 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db82k\" (UniqueName: \"kubernetes.io/projected/23e458c7-711b-489a-a13b-451a333ad874-kube-api-access-db82k\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf\" (UID: \"23e458c7-711b-489a-a13b-451a333ad874\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf"
Apr 25 00:58:51.544902 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:51.544743 2576 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23e458c7-711b-489a-a13b-451a333ad874-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf\" (UID: \"23e458c7-711b-489a-a13b-451a333ad874\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf"
Apr 25 00:58:51.544902 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:51.544780 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/23e458c7-711b-489a-a13b-451a333ad874-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf\" (UID: \"23e458c7-711b-489a-a13b-451a333ad874\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf"
Apr 25 00:58:51.544902 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:51.544816 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/23e458c7-711b-489a-a13b-451a333ad874-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf\" (UID: \"23e458c7-711b-489a-a13b-451a333ad874\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf"
Apr 25 00:58:51.645737 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:51.645702 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-db82k\" (UniqueName: \"kubernetes.io/projected/23e458c7-711b-489a-a13b-451a333ad874-kube-api-access-db82k\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf\" (UID: \"23e458c7-711b-489a-a13b-451a333ad874\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf"
Apr 25 00:58:51.645888
ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:51.645751 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23e458c7-711b-489a-a13b-451a333ad874-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf\" (UID: \"23e458c7-711b-489a-a13b-451a333ad874\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf"
Apr 25 00:58:51.645888 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:51.645868 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/23e458c7-711b-489a-a13b-451a333ad874-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf\" (UID: \"23e458c7-711b-489a-a13b-451a333ad874\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf"
Apr 25 00:58:51.646065 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:51.645954 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/23e458c7-711b-489a-a13b-451a333ad874-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf\" (UID: \"23e458c7-711b-489a-a13b-451a333ad874\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf"
Apr 25 00:58:51.646065 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:58:51.646032 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found
Apr 25 00:58:51.646187 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:58:51.646120 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23e458c7-711b-489a-a13b-451a333ad874-proxy-tls
podName:23e458c7-711b-489a-a13b-451a333ad874 nodeName:}" failed. No retries permitted until 2026-04-25 00:58:52.146097229 +0000 UTC m=+3892.427214129 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/23e458c7-711b-489a-a13b-451a333ad874-proxy-tls") pod "isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf" (UID: "23e458c7-711b-489a-a13b-451a333ad874") : secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found
Apr 25 00:58:51.646252 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:51.646185 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23e458c7-711b-489a-a13b-451a333ad874-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf\" (UID: \"23e458c7-711b-489a-a13b-451a333ad874\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf"
Apr 25 00:58:51.646602 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:51.646580 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/23e458c7-711b-489a-a13b-451a333ad874-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf\" (UID: \"23e458c7-711b-489a-a13b-451a333ad874\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf"
Apr 25 00:58:51.655236 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:51.655216 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-db82k\" (UniqueName: \"kubernetes.io/projected/23e458c7-711b-489a-a13b-451a333ad874-kube-api-access-db82k\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf\" (UID: \"23e458c7-711b-489a-a13b-451a333ad874\") "
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf" Apr 25 00:58:52.150192 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:52.150157 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/23e458c7-711b-489a-a13b-451a333ad874-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf\" (UID: \"23e458c7-711b-489a-a13b-451a333ad874\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf" Apr 25 00:58:52.152757 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:52.152733 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/23e458c7-711b-489a-a13b-451a333ad874-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf\" (UID: \"23e458c7-711b-489a-a13b-451a333ad874\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf" Apr 25 00:58:52.360485 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:52.360449 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf" Apr 25 00:58:52.480667 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:52.480611 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf"] Apr 25 00:58:52.486272 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:58:52.486231 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23e458c7_711b_489a_a13b_451a333ad874.slice/crio-6111f2758128c3106384189f70cf5660aecca5a494f67e08aa1f8175c7205212 WatchSource:0}: Error finding container 6111f2758128c3106384189f70cf5660aecca5a494f67e08aa1f8175c7205212: Status 404 returned error can't find the container with id 6111f2758128c3106384189f70cf5660aecca5a494f67e08aa1f8175c7205212 Apr 25 00:58:52.488512 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:52.488496 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:58:52.876796 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:52.876751 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf" event={"ID":"23e458c7-711b-489a-a13b-451a333ad874","Type":"ContainerStarted","Data":"bf5a266db2ccb2f5ca19f0d736830e3c3f1b87323402ad1962b63c9e4427c296"} Apr 25 00:58:52.876796 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:52.876788 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf" event={"ID":"23e458c7-711b-489a-a13b-451a333ad874","Type":"ContainerStarted","Data":"6111f2758128c3106384189f70cf5660aecca5a494f67e08aa1f8175c7205212"} Apr 25 00:58:54.417533 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.417512 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2" Apr 25 00:58:54.466884 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.466820 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/765f9e1b-e1d5-42da-9355-c7cc69f9131a-cabundle-cert\") pod \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\" (UID: \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\") " Apr 25 00:58:54.466884 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.466854 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/765f9e1b-e1d5-42da-9355-c7cc69f9131a-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\" (UID: \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\") " Apr 25 00:58:54.467087 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.466892 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87hh2\" (UniqueName: \"kubernetes.io/projected/765f9e1b-e1d5-42da-9355-c7cc69f9131a-kube-api-access-87hh2\") pod \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\" (UID: \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\") " Apr 25 00:58:54.467087 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.466960 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/765f9e1b-e1d5-42da-9355-c7cc69f9131a-kserve-provision-location\") pod \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\" (UID: \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\") " Apr 25 00:58:54.467087 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.467003 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/765f9e1b-e1d5-42da-9355-c7cc69f9131a-proxy-tls\") pod 
\"765f9e1b-e1d5-42da-9355-c7cc69f9131a\" (UID: \"765f9e1b-e1d5-42da-9355-c7cc69f9131a\") " Apr 25 00:58:54.467295 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.467276 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/765f9e1b-e1d5-42da-9355-c7cc69f9131a-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "765f9e1b-e1d5-42da-9355-c7cc69f9131a" (UID: "765f9e1b-e1d5-42da-9355-c7cc69f9131a"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:58:54.467352 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.467289 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/765f9e1b-e1d5-42da-9355-c7cc69f9131a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "765f9e1b-e1d5-42da-9355-c7cc69f9131a" (UID: "765f9e1b-e1d5-42da-9355-c7cc69f9131a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:58:54.467352 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.467305 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/765f9e1b-e1d5-42da-9355-c7cc69f9131a-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config") pod "765f9e1b-e1d5-42da-9355-c7cc69f9131a" (UID: "765f9e1b-e1d5-42da-9355-c7cc69f9131a"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:58:54.469248 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.469221 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/765f9e1b-e1d5-42da-9355-c7cc69f9131a-kube-api-access-87hh2" (OuterVolumeSpecName: "kube-api-access-87hh2") pod "765f9e1b-e1d5-42da-9355-c7cc69f9131a" (UID: "765f9e1b-e1d5-42da-9355-c7cc69f9131a"). InnerVolumeSpecName "kube-api-access-87hh2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:58:54.469248 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.469236 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/765f9e1b-e1d5-42da-9355-c7cc69f9131a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "765f9e1b-e1d5-42da-9355-c7cc69f9131a" (UID: "765f9e1b-e1d5-42da-9355-c7cc69f9131a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:58:54.567895 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.567855 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/765f9e1b-e1d5-42da-9355-c7cc69f9131a-cabundle-cert\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:58:54.567895 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.567890 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/765f9e1b-e1d5-42da-9355-c7cc69f9131a-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:58:54.568136 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.567904 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-87hh2\" (UniqueName: \"kubernetes.io/projected/765f9e1b-e1d5-42da-9355-c7cc69f9131a-kube-api-access-87hh2\") on node \"ip-10-0-129-109.ec2.internal\" 
DevicePath \"\"" Apr 25 00:58:54.568136 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.567949 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/765f9e1b-e1d5-42da-9355-c7cc69f9131a-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:58:54.568136 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.567961 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/765f9e1b-e1d5-42da-9355-c7cc69f9131a-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:58:54.883890 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.883853 2576 generic.go:358] "Generic (PLEG): container finished" podID="765f9e1b-e1d5-42da-9355-c7cc69f9131a" containerID="ca694a020d1d6dbc20796f5a438882366506de52f52bdfbb75ca68f2b13da375" exitCode=0 Apr 25 00:58:54.884114 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.883942 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2" event={"ID":"765f9e1b-e1d5-42da-9355-c7cc69f9131a","Type":"ContainerDied","Data":"ca694a020d1d6dbc20796f5a438882366506de52f52bdfbb75ca68f2b13da375"} Apr 25 00:58:54.884114 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.883971 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2" event={"ID":"765f9e1b-e1d5-42da-9355-c7cc69f9131a","Type":"ContainerDied","Data":"dbcf70063685f9d1e9c1c6bae04bebbfaa6f1ab948c083ed211de40f8bedc259"} Apr 25 00:58:54.884114 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.883971 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2" Apr 25 00:58:54.884114 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.883987 2576 scope.go:117] "RemoveContainer" containerID="3386d410924268e2351af5fcb503e3cfeba2d1e232d4b86038ef9f41c554b2fd" Apr 25 00:58:54.892171 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.892152 2576 scope.go:117] "RemoveContainer" containerID="ca694a020d1d6dbc20796f5a438882366506de52f52bdfbb75ca68f2b13da375" Apr 25 00:58:54.899328 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.899311 2576 scope.go:117] "RemoveContainer" containerID="a20e189985e54d86b485545ac95b8f316b93d675ed73f3f7edfd6e191a22d08d" Apr 25 00:58:54.906697 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.906674 2576 scope.go:117] "RemoveContainer" containerID="3386d410924268e2351af5fcb503e3cfeba2d1e232d4b86038ef9f41c554b2fd" Apr 25 00:58:54.906982 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:58:54.906956 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3386d410924268e2351af5fcb503e3cfeba2d1e232d4b86038ef9f41c554b2fd\": container with ID starting with 3386d410924268e2351af5fcb503e3cfeba2d1e232d4b86038ef9f41c554b2fd not found: ID does not exist" containerID="3386d410924268e2351af5fcb503e3cfeba2d1e232d4b86038ef9f41c554b2fd" Apr 25 00:58:54.907053 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.906994 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3386d410924268e2351af5fcb503e3cfeba2d1e232d4b86038ef9f41c554b2fd"} err="failed to get container status \"3386d410924268e2351af5fcb503e3cfeba2d1e232d4b86038ef9f41c554b2fd\": rpc error: code = NotFound desc = could not find container \"3386d410924268e2351af5fcb503e3cfeba2d1e232d4b86038ef9f41c554b2fd\": container with ID starting with 3386d410924268e2351af5fcb503e3cfeba2d1e232d4b86038ef9f41c554b2fd not found: ID does not exist" 
Apr 25 00:58:54.907053 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.907045 2576 scope.go:117] "RemoveContainer" containerID="ca694a020d1d6dbc20796f5a438882366506de52f52bdfbb75ca68f2b13da375" Apr 25 00:58:54.907462 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.907436 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"] Apr 25 00:58:54.907561 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:58:54.907515 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca694a020d1d6dbc20796f5a438882366506de52f52bdfbb75ca68f2b13da375\": container with ID starting with ca694a020d1d6dbc20796f5a438882366506de52f52bdfbb75ca68f2b13da375 not found: ID does not exist" containerID="ca694a020d1d6dbc20796f5a438882366506de52f52bdfbb75ca68f2b13da375" Apr 25 00:58:54.907561 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.907543 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca694a020d1d6dbc20796f5a438882366506de52f52bdfbb75ca68f2b13da375"} err="failed to get container status \"ca694a020d1d6dbc20796f5a438882366506de52f52bdfbb75ca68f2b13da375\": rpc error: code = NotFound desc = could not find container \"ca694a020d1d6dbc20796f5a438882366506de52f52bdfbb75ca68f2b13da375\": container with ID starting with ca694a020d1d6dbc20796f5a438882366506de52f52bdfbb75ca68f2b13da375 not found: ID does not exist" Apr 25 00:58:54.907758 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.907566 2576 scope.go:117] "RemoveContainer" containerID="a20e189985e54d86b485545ac95b8f316b93d675ed73f3f7edfd6e191a22d08d" Apr 25 00:58:54.908335 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:58:54.907973 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a20e189985e54d86b485545ac95b8f316b93d675ed73f3f7edfd6e191a22d08d\": 
container with ID starting with a20e189985e54d86b485545ac95b8f316b93d675ed73f3f7edfd6e191a22d08d not found: ID does not exist" containerID="a20e189985e54d86b485545ac95b8f316b93d675ed73f3f7edfd6e191a22d08d" Apr 25 00:58:54.908335 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.908010 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a20e189985e54d86b485545ac95b8f316b93d675ed73f3f7edfd6e191a22d08d"} err="failed to get container status \"a20e189985e54d86b485545ac95b8f316b93d675ed73f3f7edfd6e191a22d08d\": rpc error: code = NotFound desc = could not find container \"a20e189985e54d86b485545ac95b8f316b93d675ed73f3f7edfd6e191a22d08d\": container with ID starting with a20e189985e54d86b485545ac95b8f316b93d675ed73f3f7edfd6e191a22d08d not found: ID does not exist" Apr 25 00:58:54.909430 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:54.909411 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-b4cf7979c-mhjf2"] Apr 25 00:58:56.315327 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:56.315294 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="765f9e1b-e1d5-42da-9355-c7cc69f9131a" path="/var/lib/kubelet/pods/765f9e1b-e1d5-42da-9355-c7cc69f9131a/volumes" Apr 25 00:58:58.896628 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:58.896602 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf_23e458c7-711b-489a-a13b-451a333ad874/storage-initializer/0.log" Apr 25 00:58:58.897053 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:58.896642 2576 generic.go:358] "Generic (PLEG): container finished" podID="23e458c7-711b-489a-a13b-451a333ad874" containerID="bf5a266db2ccb2f5ca19f0d736830e3c3f1b87323402ad1962b63c9e4427c296" exitCode=1 Apr 25 00:58:58.897053 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:58.896680 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf" event={"ID":"23e458c7-711b-489a-a13b-451a333ad874","Type":"ContainerDied","Data":"bf5a266db2ccb2f5ca19f0d736830e3c3f1b87323402ad1962b63c9e4427c296"} Apr 25 00:58:59.901707 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:59.901677 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf_23e458c7-711b-489a-a13b-451a333ad874/storage-initializer/0.log" Apr 25 00:58:59.902222 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:58:59.901787 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf" event={"ID":"23e458c7-711b-489a-a13b-451a333ad874","Type":"ContainerStarted","Data":"88c377d98539550c9b0aa1ab9c0deb9bc553ae64edf3dca8cbfa69ebef67c40f"} Apr 25 00:59:01.442419 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:01.442381 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf"] Apr 25 00:59:01.442898 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:01.442666 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf" podUID="23e458c7-711b-489a-a13b-451a333ad874" containerName="storage-initializer" containerID="cri-o://88c377d98539550c9b0aa1ab9c0deb9bc553ae64edf3dca8cbfa69ebef67c40f" gracePeriod=30 Apr 25 00:59:02.578742 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.578719 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf_23e458c7-711b-489a-a13b-451a333ad874/storage-initializer/1.log" Apr 25 00:59:02.579134 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.579118 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf_23e458c7-711b-489a-a13b-451a333ad874/storage-initializer/0.log" Apr 25 00:59:02.579193 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.579183 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf" Apr 25 00:59:02.732111 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.732026 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db82k\" (UniqueName: \"kubernetes.io/projected/23e458c7-711b-489a-a13b-451a333ad874-kube-api-access-db82k\") pod \"23e458c7-711b-489a-a13b-451a333ad874\" (UID: \"23e458c7-711b-489a-a13b-451a333ad874\") " Apr 25 00:59:02.732111 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.732072 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/23e458c7-711b-489a-a13b-451a333ad874-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"23e458c7-711b-489a-a13b-451a333ad874\" (UID: \"23e458c7-711b-489a-a13b-451a333ad874\") " Apr 25 00:59:02.732332 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.732114 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/23e458c7-711b-489a-a13b-451a333ad874-proxy-tls\") pod \"23e458c7-711b-489a-a13b-451a333ad874\" (UID: \"23e458c7-711b-489a-a13b-451a333ad874\") " Apr 25 00:59:02.732332 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.732162 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23e458c7-711b-489a-a13b-451a333ad874-kserve-provision-location\") pod \"23e458c7-711b-489a-a13b-451a333ad874\" (UID: 
\"23e458c7-711b-489a-a13b-451a333ad874\") " Apr 25 00:59:02.732445 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.732409 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23e458c7-711b-489a-a13b-451a333ad874-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "23e458c7-711b-489a-a13b-451a333ad874" (UID: "23e458c7-711b-489a-a13b-451a333ad874"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:59:02.732445 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.732431 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23e458c7-711b-489a-a13b-451a333ad874-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config") pod "23e458c7-711b-489a-a13b-451a333ad874" (UID: "23e458c7-711b-489a-a13b-451a333ad874"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:59:02.734181 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.734158 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23e458c7-711b-489a-a13b-451a333ad874-kube-api-access-db82k" (OuterVolumeSpecName: "kube-api-access-db82k") pod "23e458c7-711b-489a-a13b-451a333ad874" (UID: "23e458c7-711b-489a-a13b-451a333ad874"). InnerVolumeSpecName "kube-api-access-db82k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:59:02.734261 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.734246 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e458c7-711b-489a-a13b-451a333ad874-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "23e458c7-711b-489a-a13b-451a333ad874" (UID: "23e458c7-711b-489a-a13b-451a333ad874"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:59:02.833307 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.833271 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/23e458c7-711b-489a-a13b-451a333ad874-proxy-tls\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:59:02.833307 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.833304 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23e458c7-711b-489a-a13b-451a333ad874-kserve-provision-location\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:59:02.833307 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.833313 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-db82k\" (UniqueName: \"kubernetes.io/projected/23e458c7-711b-489a-a13b-451a333ad874-kube-api-access-db82k\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:59:02.833476 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.833323 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/23e458c7-711b-489a-a13b-451a333ad874-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-109.ec2.internal\" DevicePath \"\"" Apr 25 00:59:02.912613 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.912591 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf_23e458c7-711b-489a-a13b-451a333ad874/storage-initializer/1.log" Apr 25 00:59:02.912989 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.912975 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf_23e458c7-711b-489a-a13b-451a333ad874/storage-initializer/0.log" Apr 25 00:59:02.913041 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.913006 2576 generic.go:358] "Generic (PLEG): container finished" podID="23e458c7-711b-489a-a13b-451a333ad874" containerID="88c377d98539550c9b0aa1ab9c0deb9bc553ae64edf3dca8cbfa69ebef67c40f" exitCode=1 Apr 25 00:59:02.913078 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.913032 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf" event={"ID":"23e458c7-711b-489a-a13b-451a333ad874","Type":"ContainerDied","Data":"88c377d98539550c9b0aa1ab9c0deb9bc553ae64edf3dca8cbfa69ebef67c40f"} Apr 25 00:59:02.913078 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.913067 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf" event={"ID":"23e458c7-711b-489a-a13b-451a333ad874","Type":"ContainerDied","Data":"6111f2758128c3106384189f70cf5660aecca5a494f67e08aa1f8175c7205212"} Apr 25 00:59:02.913146 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.913082 2576 scope.go:117] "RemoveContainer" containerID="88c377d98539550c9b0aa1ab9c0deb9bc553ae64edf3dca8cbfa69ebef67c40f" Apr 25 00:59:02.913146 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.913093 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf" Apr 25 00:59:02.920764 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.920734 2576 scope.go:117] "RemoveContainer" containerID="bf5a266db2ccb2f5ca19f0d736830e3c3f1b87323402ad1962b63c9e4427c296" Apr 25 00:59:02.927719 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.927703 2576 scope.go:117] "RemoveContainer" containerID="88c377d98539550c9b0aa1ab9c0deb9bc553ae64edf3dca8cbfa69ebef67c40f" Apr 25 00:59:02.927965 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:59:02.927945 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88c377d98539550c9b0aa1ab9c0deb9bc553ae64edf3dca8cbfa69ebef67c40f\": container with ID starting with 88c377d98539550c9b0aa1ab9c0deb9bc553ae64edf3dca8cbfa69ebef67c40f not found: ID does not exist" containerID="88c377d98539550c9b0aa1ab9c0deb9bc553ae64edf3dca8cbfa69ebef67c40f" Apr 25 00:59:02.928034 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.927972 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88c377d98539550c9b0aa1ab9c0deb9bc553ae64edf3dca8cbfa69ebef67c40f"} err="failed to get container status \"88c377d98539550c9b0aa1ab9c0deb9bc553ae64edf3dca8cbfa69ebef67c40f\": rpc error: code = NotFound desc = could not find container \"88c377d98539550c9b0aa1ab9c0deb9bc553ae64edf3dca8cbfa69ebef67c40f\": container with ID starting with 88c377d98539550c9b0aa1ab9c0deb9bc553ae64edf3dca8cbfa69ebef67c40f not found: ID does not exist" Apr 25 00:59:02.928034 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.927989 2576 scope.go:117] "RemoveContainer" containerID="bf5a266db2ccb2f5ca19f0d736830e3c3f1b87323402ad1962b63c9e4427c296" Apr 25 00:59:02.928223 ip-10-0-129-109 kubenswrapper[2576]: E0425 00:59:02.928206 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"bf5a266db2ccb2f5ca19f0d736830e3c3f1b87323402ad1962b63c9e4427c296\": container with ID starting with bf5a266db2ccb2f5ca19f0d736830e3c3f1b87323402ad1962b63c9e4427c296 not found: ID does not exist" containerID="bf5a266db2ccb2f5ca19f0d736830e3c3f1b87323402ad1962b63c9e4427c296" Apr 25 00:59:02.928262 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.928229 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf5a266db2ccb2f5ca19f0d736830e3c3f1b87323402ad1962b63c9e4427c296"} err="failed to get container status \"bf5a266db2ccb2f5ca19f0d736830e3c3f1b87323402ad1962b63c9e4427c296\": rpc error: code = NotFound desc = could not find container \"bf5a266db2ccb2f5ca19f0d736830e3c3f1b87323402ad1962b63c9e4427c296\": container with ID starting with bf5a266db2ccb2f5ca19f0d736830e3c3f1b87323402ad1962b63c9e4427c296 not found: ID does not exist" Apr 25 00:59:02.952420 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.952391 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf"] Apr 25 00:59:02.956493 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:02.956465 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5b5b4d5d99-8qppf"] Apr 25 00:59:04.315030 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:04.314996 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23e458c7-711b-489a-a13b-451a333ad874" path="/var/lib/kubelet/pods/23e458c7-711b-489a-a13b-451a333ad874/volumes" Apr 25 00:59:30.912013 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:30.911973 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-htmzv/must-gather-j4s46"] Apr 25 00:59:30.912446 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:30.912248 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="765f9e1b-e1d5-42da-9355-c7cc69f9131a" containerName="storage-initializer" Apr 25 00:59:30.912446 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:30.912259 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="765f9e1b-e1d5-42da-9355-c7cc69f9131a" containerName="storage-initializer" Apr 25 00:59:30.912446 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:30.912273 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="765f9e1b-e1d5-42da-9355-c7cc69f9131a" containerName="kserve-container" Apr 25 00:59:30.912446 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:30.912278 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="765f9e1b-e1d5-42da-9355-c7cc69f9131a" containerName="kserve-container" Apr 25 00:59:30.912446 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:30.912285 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="765f9e1b-e1d5-42da-9355-c7cc69f9131a" containerName="kube-rbac-proxy" Apr 25 00:59:30.912446 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:30.912291 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="765f9e1b-e1d5-42da-9355-c7cc69f9131a" containerName="kube-rbac-proxy" Apr 25 00:59:30.912446 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:30.912304 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23e458c7-711b-489a-a13b-451a333ad874" containerName="storage-initializer" Apr 25 00:59:30.912446 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:30.912311 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e458c7-711b-489a-a13b-451a333ad874" containerName="storage-initializer" Apr 25 00:59:30.912446 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:30.912322 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23e458c7-711b-489a-a13b-451a333ad874" containerName="storage-initializer" Apr 25 00:59:30.912446 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:30.912327 2576 
state_mem.go:107] "Deleted CPUSet assignment" podUID="23e458c7-711b-489a-a13b-451a333ad874" containerName="storage-initializer" Apr 25 00:59:30.912446 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:30.912370 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="23e458c7-711b-489a-a13b-451a333ad874" containerName="storage-initializer" Apr 25 00:59:30.912446 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:30.912383 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="765f9e1b-e1d5-42da-9355-c7cc69f9131a" containerName="kserve-container" Apr 25 00:59:30.912446 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:30.912392 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="765f9e1b-e1d5-42da-9355-c7cc69f9131a" containerName="kube-rbac-proxy" Apr 25 00:59:30.912843 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:30.912469 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="23e458c7-711b-489a-a13b-451a333ad874" containerName="storage-initializer" Apr 25 00:59:30.915205 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:30.915188 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-htmzv/must-gather-j4s46" Apr 25 00:59:30.917574 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:30.917543 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-htmzv\"/\"default-dockercfg-tmtwr\"" Apr 25 00:59:30.918603 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:30.918584 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-htmzv\"/\"kube-root-ca.crt\"" Apr 25 00:59:30.918603 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:30.918596 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-htmzv\"/\"openshift-service-ca.crt\"" Apr 25 00:59:30.922087 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:30.922063 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-htmzv/must-gather-j4s46"] Apr 25 00:59:31.032206 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:31.032181 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8ce701f-4878-4077-a3de-2847a5db04f3-must-gather-output\") pod \"must-gather-j4s46\" (UID: \"b8ce701f-4878-4077-a3de-2847a5db04f3\") " pod="openshift-must-gather-htmzv/must-gather-j4s46" Apr 25 00:59:31.032359 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:31.032234 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m6wl\" (UniqueName: \"kubernetes.io/projected/b8ce701f-4878-4077-a3de-2847a5db04f3-kube-api-access-6m6wl\") pod \"must-gather-j4s46\" (UID: \"b8ce701f-4878-4077-a3de-2847a5db04f3\") " pod="openshift-must-gather-htmzv/must-gather-j4s46" Apr 25 00:59:31.133066 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:31.133023 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6m6wl\" (UniqueName: 
\"kubernetes.io/projected/b8ce701f-4878-4077-a3de-2847a5db04f3-kube-api-access-6m6wl\") pod \"must-gather-j4s46\" (UID: \"b8ce701f-4878-4077-a3de-2847a5db04f3\") " pod="openshift-must-gather-htmzv/must-gather-j4s46" Apr 25 00:59:31.133211 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:31.133084 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8ce701f-4878-4077-a3de-2847a5db04f3-must-gather-output\") pod \"must-gather-j4s46\" (UID: \"b8ce701f-4878-4077-a3de-2847a5db04f3\") " pod="openshift-must-gather-htmzv/must-gather-j4s46" Apr 25 00:59:31.133369 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:31.133349 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b8ce701f-4878-4077-a3de-2847a5db04f3-must-gather-output\") pod \"must-gather-j4s46\" (UID: \"b8ce701f-4878-4077-a3de-2847a5db04f3\") " pod="openshift-must-gather-htmzv/must-gather-j4s46" Apr 25 00:59:31.141255 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:31.141230 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m6wl\" (UniqueName: \"kubernetes.io/projected/b8ce701f-4878-4077-a3de-2847a5db04f3-kube-api-access-6m6wl\") pod \"must-gather-j4s46\" (UID: \"b8ce701f-4878-4077-a3de-2847a5db04f3\") " pod="openshift-must-gather-htmzv/must-gather-j4s46" Apr 25 00:59:31.225370 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:31.225304 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-htmzv/must-gather-j4s46" Apr 25 00:59:31.340800 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:31.340767 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-htmzv/must-gather-j4s46"] Apr 25 00:59:31.343871 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:59:31.343838 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8ce701f_4878_4077_a3de_2847a5db04f3.slice/crio-0a55f4b7bb321aa2858fba9f59a475d74bea26febfe80278db2a82891ae57972 WatchSource:0}: Error finding container 0a55f4b7bb321aa2858fba9f59a475d74bea26febfe80278db2a82891ae57972: Status 404 returned error can't find the container with id 0a55f4b7bb321aa2858fba9f59a475d74bea26febfe80278db2a82891ae57972 Apr 25 00:59:31.999427 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:31.999397 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-htmzv/must-gather-j4s46" event={"ID":"b8ce701f-4878-4077-a3de-2847a5db04f3","Type":"ContainerStarted","Data":"0a55f4b7bb321aa2858fba9f59a475d74bea26febfe80278db2a82891ae57972"} Apr 25 00:59:33.010982 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:33.010942 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-htmzv/must-gather-j4s46" event={"ID":"b8ce701f-4878-4077-a3de-2847a5db04f3","Type":"ContainerStarted","Data":"6d3047725b677ae115803cffae558b0ecb0c4856a84f35275837ae1328725268"} Apr 25 00:59:33.011459 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:33.011324 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-htmzv/must-gather-j4s46" event={"ID":"b8ce701f-4878-4077-a3de-2847a5db04f3","Type":"ContainerStarted","Data":"0c9ca8966a5a36121d9b24146ecc784fe1473de2a727cf949e8d2c8d0baaa425"} Apr 25 00:59:33.029772 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:33.029716 2576 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-htmzv/must-gather-j4s46" podStartSLOduration=2.180186425 podStartE2EDuration="3.029699635s" podCreationTimestamp="2026-04-25 00:59:30 +0000 UTC" firstStartedPulling="2026-04-25 00:59:31.345962719 +0000 UTC m=+3931.627079617" lastFinishedPulling="2026-04-25 00:59:32.195475917 +0000 UTC m=+3932.476592827" observedRunningTime="2026-04-25 00:59:33.026521985 +0000 UTC m=+3933.307638906" watchObservedRunningTime="2026-04-25 00:59:33.029699635 +0000 UTC m=+3933.310816556" Apr 25 00:59:33.696068 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:33.696031 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-cgk2q_301e8349-12fd-4785-95ed-4b2e9b42b9a6/global-pull-secret-syncer/0.log" Apr 25 00:59:33.836879 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:33.836851 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-48h4q_038b3357-ba5f-4aa6-8bda-d7a61161c9ce/konnectivity-agent/0.log" Apr 25 00:59:33.967062 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:33.966978 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-109.ec2.internal_d2e34ff8c6ad1ea6f4df1f49c9e13e6c/haproxy/0.log" Apr 25 00:59:37.236154 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:37.236121 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 25 00:59:37.237684 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:37.237659 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 25 00:59:37.708155 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:37.708123 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-cq4m6_3341466c-0b0e-499e-8a69-b4a033f0e495/kube-state-metrics/0.log" Apr 25 00:59:37.728691 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:37.728660 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-cq4m6_3341466c-0b0e-499e-8a69-b4a033f0e495/kube-rbac-proxy-main/0.log" Apr 25 00:59:37.751793 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:37.751766 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-cq4m6_3341466c-0b0e-499e-8a69-b4a033f0e495/kube-rbac-proxy-self/0.log" Apr 25 00:59:37.910625 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:37.910589 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-569p9_12c65432-33ba-4594-b447-d3f8ad398777/node-exporter/0.log" Apr 25 00:59:37.964795 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:37.964711 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-569p9_12c65432-33ba-4594-b447-d3f8ad398777/kube-rbac-proxy/0.log" Apr 25 00:59:37.993104 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:37.993078 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-569p9_12c65432-33ba-4594-b447-d3f8ad398777/init-textfile/0.log" Apr 25 00:59:40.571089 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:40.571006 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm"] Apr 25 00:59:40.575569 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:40.575542 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm" Apr 25 00:59:40.583944 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:40.583900 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm"] Apr 25 00:59:40.716467 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:40.716426 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3d3d0397-8256-4fca-8b04-5bc310f76b3e-podres\") pod \"perf-node-gather-daemonset-zq2pm\" (UID: \"3d3d0397-8256-4fca-8b04-5bc310f76b3e\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm" Apr 25 00:59:40.716634 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:40.716494 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d3d0397-8256-4fca-8b04-5bc310f76b3e-lib-modules\") pod \"perf-node-gather-daemonset-zq2pm\" (UID: \"3d3d0397-8256-4fca-8b04-5bc310f76b3e\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm" Apr 25 00:59:40.716634 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:40.716565 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d3d0397-8256-4fca-8b04-5bc310f76b3e-sys\") pod \"perf-node-gather-daemonset-zq2pm\" (UID: \"3d3d0397-8256-4fca-8b04-5bc310f76b3e\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm" Apr 25 00:59:40.716634 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:40.716595 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3d3d0397-8256-4fca-8b04-5bc310f76b3e-proc\") pod \"perf-node-gather-daemonset-zq2pm\" (UID: \"3d3d0397-8256-4fca-8b04-5bc310f76b3e\") " 
pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm" Apr 25 00:59:40.716634 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:40.716622 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzx5n\" (UniqueName: \"kubernetes.io/projected/3d3d0397-8256-4fca-8b04-5bc310f76b3e-kube-api-access-dzx5n\") pod \"perf-node-gather-daemonset-zq2pm\" (UID: \"3d3d0397-8256-4fca-8b04-5bc310f76b3e\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm" Apr 25 00:59:40.817803 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:40.817772 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d3d0397-8256-4fca-8b04-5bc310f76b3e-sys\") pod \"perf-node-gather-daemonset-zq2pm\" (UID: \"3d3d0397-8256-4fca-8b04-5bc310f76b3e\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm" Apr 25 00:59:40.817803 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:40.817804 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3d3d0397-8256-4fca-8b04-5bc310f76b3e-proc\") pod \"perf-node-gather-daemonset-zq2pm\" (UID: \"3d3d0397-8256-4fca-8b04-5bc310f76b3e\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm" Apr 25 00:59:40.818084 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:40.817828 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzx5n\" (UniqueName: \"kubernetes.io/projected/3d3d0397-8256-4fca-8b04-5bc310f76b3e-kube-api-access-dzx5n\") pod \"perf-node-gather-daemonset-zq2pm\" (UID: \"3d3d0397-8256-4fca-8b04-5bc310f76b3e\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm" Apr 25 00:59:40.818084 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:40.817853 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/3d3d0397-8256-4fca-8b04-5bc310f76b3e-podres\") pod \"perf-node-gather-daemonset-zq2pm\" (UID: \"3d3d0397-8256-4fca-8b04-5bc310f76b3e\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm" Apr 25 00:59:40.818084 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:40.817881 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d3d0397-8256-4fca-8b04-5bc310f76b3e-lib-modules\") pod \"perf-node-gather-daemonset-zq2pm\" (UID: \"3d3d0397-8256-4fca-8b04-5bc310f76b3e\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm" Apr 25 00:59:40.818084 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:40.817904 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d3d0397-8256-4fca-8b04-5bc310f76b3e-sys\") pod \"perf-node-gather-daemonset-zq2pm\" (UID: \"3d3d0397-8256-4fca-8b04-5bc310f76b3e\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm" Apr 25 00:59:40.818084 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:40.817961 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3d3d0397-8256-4fca-8b04-5bc310f76b3e-proc\") pod \"perf-node-gather-daemonset-zq2pm\" (UID: \"3d3d0397-8256-4fca-8b04-5bc310f76b3e\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm" Apr 25 00:59:40.818084 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:40.818023 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d3d0397-8256-4fca-8b04-5bc310f76b3e-lib-modules\") pod \"perf-node-gather-daemonset-zq2pm\" (UID: \"3d3d0397-8256-4fca-8b04-5bc310f76b3e\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm" Apr 25 00:59:40.818084 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:40.818031 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3d3d0397-8256-4fca-8b04-5bc310f76b3e-podres\") pod \"perf-node-gather-daemonset-zq2pm\" (UID: \"3d3d0397-8256-4fca-8b04-5bc310f76b3e\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm" Apr 25 00:59:40.825456 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:40.825400 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzx5n\" (UniqueName: \"kubernetes.io/projected/3d3d0397-8256-4fca-8b04-5bc310f76b3e-kube-api-access-dzx5n\") pod \"perf-node-gather-daemonset-zq2pm\" (UID: \"3d3d0397-8256-4fca-8b04-5bc310f76b3e\") " pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm" Apr 25 00:59:40.888516 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:40.888485 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm" Apr 25 00:59:41.026575 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:41.026532 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm"] Apr 25 00:59:41.029127 ip-10-0-129-109 kubenswrapper[2576]: W0425 00:59:41.029096 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3d3d0397_8256_4fca_8b04_5bc310f76b3e.slice/crio-49b877d11102a6d80fb0035fed4d60ac5b155ce5a4ed4967bcec42e9c4b787e9 WatchSource:0}: Error finding container 49b877d11102a6d80fb0035fed4d60ac5b155ce5a4ed4967bcec42e9c4b787e9: Status 404 returned error can't find the container with id 49b877d11102a6d80fb0035fed4d60ac5b155ce5a4ed4967bcec42e9c4b787e9 Apr 25 00:59:41.041027 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:41.040997 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm" 
event={"ID":"3d3d0397-8256-4fca-8b04-5bc310f76b3e","Type":"ContainerStarted","Data":"49b877d11102a6d80fb0035fed4d60ac5b155ce5a4ed4967bcec42e9c4b787e9"} Apr 25 00:59:41.529145 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:41.529099 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fpgn9_972771be-01b9-4da1-b895-914fde15bc88/dns/0.log" Apr 25 00:59:41.548728 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:41.548710 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fpgn9_972771be-01b9-4da1-b895-914fde15bc88/kube-rbac-proxy/0.log" Apr 25 00:59:41.611642 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:41.611616 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-58s9z_68924e4d-1b30-4887-a8bc-c624385685df/dns-node-resolver/0.log" Apr 25 00:59:42.046288 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:42.046258 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm" event={"ID":"3d3d0397-8256-4fca-8b04-5bc310f76b3e","Type":"ContainerStarted","Data":"e6ed8910abbfa35f96758308990e6e51659f71e79df6b150f24d5c5ce960bdd8"} Apr 25 00:59:42.046487 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:42.046443 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm" Apr 25 00:59:42.061504 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:42.061447 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm" podStartSLOduration=2.06142996 podStartE2EDuration="2.06142996s" podCreationTimestamp="2026-04-25 00:59:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:59:42.059970392 +0000 UTC m=+3942.341087336" 
watchObservedRunningTime="2026-04-25 00:59:42.06142996 +0000 UTC m=+3942.342546881" Apr 25 00:59:42.152500 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:42.152461 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-6fc98_808b8c86-7996-4c7e-b677-dc648c7c5598/node-ca/0.log" Apr 25 00:59:43.151026 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:43.150984 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6zfbj_73aaed41-fe6b-4446-8ab2-95e11e051d4b/serve-healthcheck-canary/0.log" Apr 25 00:59:43.610259 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:43.610231 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bj6s9_21a2be45-b5f0-4fe5-aabf-c7b8783b56af/kube-rbac-proxy/0.log" Apr 25 00:59:43.629015 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:43.628983 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bj6s9_21a2be45-b5f0-4fe5-aabf-c7b8783b56af/exporter/0.log" Apr 25 00:59:43.650042 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:43.650018 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bj6s9_21a2be45-b5f0-4fe5-aabf-c7b8783b56af/extractor/0.log" Apr 25 00:59:45.745844 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:45.745815 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-cf89d_d6fe5d32-305c-48ba-af24-e0fc40b04868/server/0.log" Apr 25 00:59:45.995995 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:45.995894 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-wpr5g_5fd9f0c0-07d6-482b-b3b6-e3c5ee980597/manager/0.log" Apr 25 00:59:46.014147 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:46.014125 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve_s3-init-44llv_5a576370-66a9-4fd2-b563-7c3efaa5712d/s3-init/0.log" Apr 25 00:59:46.035393 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:46.035369 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-2g2bm_5337f27c-0430-45dc-99d5-dc9a28b10f64/s3-tls-init-custom/0.log" Apr 25 00:59:46.056981 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:46.056958 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-9rbcb_9b14cfaf-b24c-4ab0-b9f9-2f378c08f836/s3-tls-init-serving/0.log" Apr 25 00:59:46.085439 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:46.085408 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-24pcs_81057cd7-9d33-4ce0-ad47-d4eaca1082b7/seaweedfs/0.log" Apr 25 00:59:46.127198 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:46.127168 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-6jtgc_a0f88363-12b0-46f6-96d5-796d9680366f/seaweedfs-tls-serving/0.log" Apr 25 00:59:48.062134 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:48.062103 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-htmzv/perf-node-gather-daemonset-zq2pm" Apr 25 00:59:51.365384 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:51.365350 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-brnc4_f800603e-9119-44c9-9253-07fb97437cd7/kube-multus-additional-cni-plugins/0.log" Apr 25 00:59:51.386328 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:51.386300 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-brnc4_f800603e-9119-44c9-9253-07fb97437cd7/egress-router-binary-copy/0.log" Apr 25 00:59:51.408185 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:51.408157 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-brnc4_f800603e-9119-44c9-9253-07fb97437cd7/cni-plugins/0.log" Apr 25 00:59:51.430494 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:51.430459 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-brnc4_f800603e-9119-44c9-9253-07fb97437cd7/bond-cni-plugin/0.log" Apr 25 00:59:51.450336 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:51.450305 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-brnc4_f800603e-9119-44c9-9253-07fb97437cd7/routeoverride-cni/0.log" Apr 25 00:59:51.471660 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:51.471635 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-brnc4_f800603e-9119-44c9-9253-07fb97437cd7/whereabouts-cni-bincopy/0.log" Apr 25 00:59:51.495994 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:51.495957 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-brnc4_f800603e-9119-44c9-9253-07fb97437cd7/whereabouts-cni/0.log" Apr 25 00:59:51.572242 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:51.572208 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g9gll_2cb3d0c7-68be-4f8d-b00f-77ab92cbb94f/kube-multus/0.log" Apr 25 00:59:51.616392 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:51.616307 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fdw8f_6206bc2d-d85c-4007-8a04-e9eb243f590c/network-metrics-daemon/0.log" Apr 25 00:59:51.635155 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:51.635123 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fdw8f_6206bc2d-d85c-4007-8a04-e9eb243f590c/kube-rbac-proxy/0.log" Apr 25 00:59:52.417978 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:52.417948 
2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-controller/0.log" Apr 25 00:59:52.432723 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:52.432698 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/0.log" Apr 25 00:59:52.450836 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:52.450810 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovn-acl-logging/1.log" Apr 25 00:59:52.468348 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:52.468329 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/kube-rbac-proxy-node/0.log" Apr 25 00:59:52.490420 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:52.490399 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/kube-rbac-proxy-ovn-metrics/0.log" Apr 25 00:59:52.506745 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:52.506686 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/northd/0.log" Apr 25 00:59:52.525534 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:52.525513 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/nbdb/0.log" Apr 25 00:59:52.544717 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:52.544694 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/sbdb/0.log" Apr 25 00:59:52.664104 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:52.664069 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27ksn_27ed6ad4-863b-4379-8e79-0244d71ad92d/ovnkube-controller/0.log" Apr 25 00:59:54.272547 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:54.272517 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-mthk5_acf3640a-1870-4ea5-b4cb-f6e0d7abccf0/network-check-target-container/0.log" Apr 25 00:59:55.160882 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:55.160856 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-qkrqf_ad8d234f-a974-4a38-8d63-b660058eeb43/iptables-alerter/0.log" Apr 25 00:59:55.721501 ip-10-0-129-109 kubenswrapper[2576]: I0425 00:59:55.721476 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-6lz9z_6633b011-7fd6-404a-b15b-b4d8f7c11aba/tuned/0.log"