Apr 24 16:39:01.635571 ip-10-0-142-182 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 16:39:01.992290 ip-10-0-142-182 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:39:01.992290 ip-10-0-142-182 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 16:39:01.992290 ip-10-0-142-182 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:39:01.992290 ip-10-0-142-182 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 16:39:01.992290 ip-10-0-142-182 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:39:01.993524 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.993377 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 16:39:01.996941 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.996926 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:39:01.996981 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.996947 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:39:01.996981 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.996951 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:39:01.996981 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.996955 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:39:01.996981 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.996959 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:39:01.996981 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.996962 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:39:01.996981 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.996964 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:39:01.996981 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.996967 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:39:01.996981 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.996969 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:39:01.996981 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.996972 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:39:01.996981 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.996975 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:39:01.996981 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.996977 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:39:01.996981 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.996979 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:39:01.996981 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.996982 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:39:01.996981 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.996985 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:39:01.996981 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.996988 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:39:01.997344 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.996991 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:39:01.997344 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.996993 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:39:01.997344 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.996996 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:39:01.997344 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.996998 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:39:01.997344 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997001 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:39:01.997344 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997003 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:39:01.997344 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997005 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:39:01.997344 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997009 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:39:01.997344 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997013 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:39:01.997344 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997015 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:39:01.997344 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997018 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:39:01.997344 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997020 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:39:01.997344 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997022 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:39:01.997344 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997025 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:39:01.997344 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997027 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:39:01.997344 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997029 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:39:01.997344 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997032 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:39:01.997344 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997034 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:39:01.997344 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997037 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:39:01.997344 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997039 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:39:01.997820 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997042 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:39:01.997820 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997044 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:39:01.997820 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997046 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:39:01.997820 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997049 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:39:01.997820 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997051 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:39:01.997820 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997053 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:39:01.997820 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997056 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:39:01.997820 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997058 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:39:01.997820 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997060 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:39:01.997820 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997063 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:39:01.997820 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997065 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:39:01.997820 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997067 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:39:01.997820 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997069 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:39:01.997820 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997074 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:39:01.997820 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997077 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:39:01.997820 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997079 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:39:01.997820 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997081 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:39:01.997820 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997084 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:39:01.997820 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997086 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:39:01.998299 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997089 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:39:01.998299 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997091 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:39:01.998299 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997094 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:39:01.998299 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997096 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:39:01.998299 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997099 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:39:01.998299 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997102 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:39:01.998299 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997104 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:39:01.998299 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997108 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:39:01.998299 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997111 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:39:01.998299 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997114 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:39:01.998299 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997117 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:39:01.998299 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997120 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:39:01.998299 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997131 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:39:01.998299 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997133 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:39:01.998299 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997136 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:39:01.998299 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997138 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:39:01.998299 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997141 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:39:01.998299 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997143 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:39:01.998299 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997146 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:39:01.998779 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997148 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:39:01.998779 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997151 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:39:01.998779 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997153 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:39:01.998779 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997156 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:39:01.998779 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997158 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:39:01.998779 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997160 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:39:01.998779 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997162 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:39:01.998779 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997165 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:39:01.998779 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997168 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:39:01.998779 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997171 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:39:01.998779 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997173 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:39:01.998779 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997176 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:39:01.998779 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997600 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:39:01.998779 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997606 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:39:01.998779 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997608 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:39:01.998779 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997611 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:39:01.998779 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997614 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:39:01.998779 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997616 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:39:01.998779 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997619 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:39:01.999213 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997621 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:39:01.999213 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997623 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:39:01.999213 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997626 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:39:01.999213 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997628 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:39:01.999213 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997631 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:39:01.999213 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997633 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:39:01.999213 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997637 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:39:01.999213 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997639 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:39:01.999213 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997642 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:39:01.999213 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997645 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:39:01.999213 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997647 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:39:01.999213 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997649 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:39:01.999213 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997652 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:39:01.999213 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997654 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:39:01.999213 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997657 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:39:01.999213 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997659 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:39:01.999213 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997661 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:39:01.999213 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997664 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:39:01.999213 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997666 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:39:01.999213 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997668 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:39:01.999725 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997671 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:39:01.999725 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997673 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:39:01.999725 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997676 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:39:01.999725 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997678 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:39:01.999725 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997681 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:39:01.999725 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997683 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:39:01.999725 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997686 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:39:01.999725 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997688 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:39:01.999725 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997691 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:39:01.999725 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997693 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:39:01.999725 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997696 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:39:01.999725 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997699 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:39:01.999725 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997701 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:39:01.999725 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997703 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:39:01.999725 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997706 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:39:01.999725 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997708 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:39:01.999725 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997711 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:39:01.999725 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997713 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:39:01.999725 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997715 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:39:02.000198 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997723 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:39:02.000198 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997726 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:39:02.000198 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997728 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:39:02.000198 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997732 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:39:02.000198 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997735 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:39:02.000198 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997737 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:39:02.000198 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997740 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:39:02.000198 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997742 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:39:02.000198 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997744 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:39:02.000198 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997747 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:39:02.000198 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997749 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:39:02.000198 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997752 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:39:02.000198 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997755 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:39:02.000198 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997757 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:39:02.000198 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997759 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:39:02.000198 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997762 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:39:02.000198 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997764 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:39:02.000198 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997766 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:39:02.000198 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997769 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:39:02.000198 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997771 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:39:02.000688 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997773 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:39:02.000688 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997776 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:39:02.000688 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997783 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:39:02.000688 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997786 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:39:02.000688 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997788 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:39:02.000688 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997791 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:39:02.000688 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997795 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:39:02.000688 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997799 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:39:02.000688 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997802 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:39:02.000688 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997804 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:39:02.000688 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997807 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:39:02.000688 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997809 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:39:02.000688 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997812 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:39:02.000688 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997820 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:39:02.000688 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997822 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:39:02.000688 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997825 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:39:02.000688 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997827 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:39:02.000688 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997830 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:39:02.000688 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997832 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:39:02.000688 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.997835 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998387 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998398 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998403 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998408 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998412 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998415 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998419 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998423 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998426 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998429 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998432 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998435 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998438 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998441 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998444 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998447 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998449 2573 flags.go:64] FLAG: --cloud-config=""
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998452 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998455 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998462 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998464 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998467 2573 flags.go:64] FLAG: --config-dir=""
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998470 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998473 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 16:39:02.001157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998477 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998479 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998484 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998487 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998489 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998492 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998495 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998498 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998501 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998505 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998508 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998511 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998514 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998517 2573 flags.go:64] FLAG: --enable-server="true"
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998520 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998526 2573 flags.go:64] FLAG: --event-burst="100"
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998529 2573 flags.go:64] FLAG: --event-qps="50"
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998532 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998535 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998539 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998542 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998545 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998548 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998551 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998554 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 16:39:02.001777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998556 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 16:39:02.002457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998559 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 16:39:02.002457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998562 2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 16:39:02.002457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998565 2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 16:39:02.002457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998567 2573 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 16:39:02.002457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998570 2573 flags.go:64] FLAG: --feature-gates=""
Apr 24 16:39:02.002457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998573 2573 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 16:39:02.002457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998576 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24 16:39:02.002457 ip-10-0-142-182
kubenswrapper[2573]: I0424 16:39:01.998579 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 16:39:02.002457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998582 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 16:39:02.002457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998585 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 24 16:39:02.002457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998588 2573 flags.go:64] FLAG: --help="false" Apr 24 16:39:02.002457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998590 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-142-182.ec2.internal" Apr 24 16:39:02.002457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998593 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 16:39:02.002457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998596 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 16:39:02.002457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998599 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 16:39:02.002457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998603 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 16:39:02.002457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998606 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 16:39:02.002457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998609 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 16:39:02.002457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998612 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 16:39:02.002457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998614 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 16:39:02.002457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998617 2573 flags.go:64] FLAG: 
--kube-api-burst="100" Apr 24 16:39:02.002457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998620 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 16:39:02.002457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998623 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 16:39:02.002457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998626 2573 flags.go:64] FLAG: --kube-reserved="" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998628 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998631 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998634 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998637 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998640 2573 flags.go:64] FLAG: --lock-file="" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998642 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998645 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998648 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998653 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998656 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998658 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: 
I0424 16:39:01.998661 2573 flags.go:64] FLAG: --logging-format="text" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998664 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998667 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998670 2573 flags.go:64] FLAG: --manifest-url="" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998673 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998677 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998680 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998684 2573 flags.go:64] FLAG: --max-pods="110" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998687 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998690 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998693 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998696 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998699 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 16:39:02.003016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998702 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998705 2573 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998712 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998715 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998718 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998720 2573 flags.go:64] FLAG: --pod-cidr="" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998723 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998728 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998731 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998734 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998736 2573 flags.go:64] FLAG: --port="10250" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998739 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998742 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-042737a498da6625c" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998745 2573 flags.go:64] FLAG: --qos-reserved="" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998748 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998750 
2573 flags.go:64] FLAG: --register-node="true" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998753 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998756 2573 flags.go:64] FLAG: --register-with-taints="" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998759 2573 flags.go:64] FLAG: --registry-burst="10" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998762 2573 flags.go:64] FLAG: --registry-qps="5" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998764 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998767 2573 flags.go:64] FLAG: --reserved-memory="" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998770 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998773 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998776 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 16:39:02.003609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998778 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998781 2573 flags.go:64] FLAG: --runonce="false" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998784 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998787 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998792 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 
16:39:01.998794 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998797 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998800 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998803 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998809 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998811 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998814 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998817 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998820 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998823 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998826 2573 flags.go:64] FLAG: --system-cgroups="" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998828 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998833 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998836 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998839 2573 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998847 2573 flags.go:64] FLAG: --tls-min-version="" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998850 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998852 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998855 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998858 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 16:39:02.004173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998861 2573 flags.go:64] FLAG: --v="2" Apr 24 16:39:02.004773 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998865 2573 flags.go:64] FLAG: --version="false" Apr 24 16:39:02.004773 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998869 2573 flags.go:64] FLAG: --vmodule="" Apr 24 16:39:02.004773 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998877 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 16:39:02.004773 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.998880 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 16:39:02.004773 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.998976 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 16:39:02.004773 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.998980 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 24 16:39:02.004773 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.998983 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 16:39:02.004773 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.998985 2573 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesvSphere Apr 24 16:39:02.004773 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.998988 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 16:39:02.004773 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.998991 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 16:39:02.004773 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.998996 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 16:39:02.004773 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.998999 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 16:39:02.004773 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999002 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 16:39:02.004773 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999004 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 16:39:02.004773 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999007 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 16:39:02.004773 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999012 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 16:39:02.004773 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999015 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 16:39:02.004773 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999018 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 16:39:02.004773 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999020 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 16:39:02.004773 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999023 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 16:39:02.005242 ip-10-0-142-182 
kubenswrapper[2573]: W0424 16:39:01.999025 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 16:39:02.005242 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999028 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 16:39:02.005242 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999030 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 16:39:02.005242 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999032 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 16:39:02.005242 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999035 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 16:39:02.005242 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999037 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 16:39:02.005242 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999040 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 16:39:02.005242 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999043 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 16:39:02.005242 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999046 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 16:39:02.005242 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999048 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 16:39:02.005242 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999050 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 16:39:02.005242 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999053 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 16:39:02.005242 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999055 2573 feature_gate.go:328] unrecognized feature gate: 
GatewayAPI Apr 24 16:39:02.005242 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999057 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 16:39:02.005242 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999060 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 16:39:02.005242 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999062 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 16:39:02.005242 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999064 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 16:39:02.005242 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999067 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 16:39:02.005242 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999071 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 16:39:02.005833 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999075 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 16:39:02.005833 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999077 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 16:39:02.005833 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999080 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 16:39:02.005833 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999088 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 16:39:02.005833 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999090 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 16:39:02.005833 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999093 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 16:39:02.005833 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999095 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 16:39:02.005833 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999098 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 16:39:02.005833 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999101 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 16:39:02.005833 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999103 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 16:39:02.005833 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999106 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 16:39:02.005833 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999108 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 16:39:02.005833 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999110 2573 
feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 16:39:02.005833 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999113 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 16:39:02.005833 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999115 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 16:39:02.005833 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999118 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 16:39:02.005833 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999120 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 16:39:02.005833 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999123 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 16:39:02.005833 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999125 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 16:39:02.005833 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999128 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 16:39:02.006620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999131 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 16:39:02.006620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999133 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 16:39:02.006620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999135 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 16:39:02.006620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999138 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 16:39:02.006620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999140 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 16:39:02.006620 ip-10-0-142-182 
kubenswrapper[2573]: W0424 16:39:01.999142 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 16:39:02.006620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999145 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 16:39:02.006620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999147 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 16:39:02.006620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999149 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 16:39:02.006620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999152 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 16:39:02.006620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999155 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 16:39:02.006620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999157 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 16:39:02.006620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999159 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 16:39:02.006620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999161 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 16:39:02.006620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999164 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 16:39:02.006620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999167 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 16:39:02.006620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999171 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 16:39:02.006620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999174 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:39:02.006620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999177 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:39:02.007247 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999180 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:39:02.007247 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999184 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:39:02.007247 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999186 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:39:02.007247 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999189 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:39:02.007247 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999192 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:39:02.007247 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999195 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:39:02.007247 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999197 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:39:02.007247 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999200 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:39:02.007247 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999202 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:39:02.007247 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999205 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:39:02.007247 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999207 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:39:02.007247 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:01.999209 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:39:02.007247 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:01.999715 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 16:39:02.007620 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.007402 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 16:39:02.007620 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.007417 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 16:39:02.007620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007464 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:39:02.007620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007469 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:39:02.007620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007473 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:39:02.007620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007475 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:39:02.007620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007478 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:39:02.007620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007481 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:39:02.007620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007484 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:39:02.007620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007487 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:39:02.007620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007489 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:39:02.007620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007493 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:39:02.007620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007497 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:39:02.007620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007500 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:39:02.007620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007503 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:39:02.007620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007505 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:39:02.007620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007508 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:39:02.007620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007510 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:39:02.007620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007513 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:39:02.007620 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007515 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:39:02.008087 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007518 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:39:02.008087 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007520 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:39:02.008087 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007523 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:39:02.008087 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007526 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:39:02.008087 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007528 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:39:02.008087 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007531 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:39:02.008087 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007534 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:39:02.008087 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007536 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:39:02.008087 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007538 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:39:02.008087 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007541 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:39:02.008087 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007543 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:39:02.008087 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007546 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:39:02.008087 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007548 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:39:02.008087 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007551 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:39:02.008087 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007554 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:39:02.008087 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007556 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:39:02.008087 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007559 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:39:02.008087 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007561 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:39:02.008087 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007563 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:39:02.008087 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007566 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:39:02.008599 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007568 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:39:02.008599 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007570 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:39:02.008599 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007574 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:39:02.008599 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007578 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:39:02.008599 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007581 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:39:02.008599 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007585 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:39:02.008599 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007588 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:39:02.008599 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007590 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:39:02.008599 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007593 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:39:02.008599 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007596 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:39:02.008599 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007598 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:39:02.008599 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007600 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:39:02.008599 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007603 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:39:02.008599 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007605 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:39:02.008599 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007608 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:39:02.008599 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007610 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:39:02.008599 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007613 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:39:02.008599 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007615 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:39:02.008599 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007618 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:39:02.009044 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007620 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:39:02.009044 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007623 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:39:02.009044 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007625 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:39:02.009044 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007628 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:39:02.009044 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007631 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:39:02.009044 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007633 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:39:02.009044 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007637 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:39:02.009044 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007640 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:39:02.009044 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007642 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:39:02.009044 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007645 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:39:02.009044 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007647 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:39:02.009044 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007650 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:39:02.009044 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007652 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:39:02.009044 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007654 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:39:02.009044 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007656 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:39:02.009044 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007659 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:39:02.009044 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007661 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:39:02.009044 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007664 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:39:02.009044 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007666 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:39:02.009044 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007669 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:39:02.009558 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007672 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:39:02.009558 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007675 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:39:02.009558 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007677 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:39:02.009558 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007679 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:39:02.009558 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007682 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:39:02.009558 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007684 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:39:02.009558 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007687 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:39:02.009558 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007689 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:39:02.009558 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007691 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:39:02.009558 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.007696 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 16:39:02.009558 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007784 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:39:02.009558 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007789 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:39:02.009558 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007791 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:39:02.009558 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007794 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:39:02.009558 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007796 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:39:02.009914 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007799 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:39:02.009914 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007802 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:39:02.009914 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007804 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:39:02.009914 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007806 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:39:02.009914 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007809 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:39:02.009914 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007811 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:39:02.009914 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007814 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:39:02.009914 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007816 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:39:02.009914 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007819 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:39:02.009914 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007821 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:39:02.009914 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007823 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:39:02.009914 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007826 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:39:02.009914 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007828 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:39:02.009914 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007830 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:39:02.009914 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007833 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:39:02.009914 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007835 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:39:02.009914 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007837 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:39:02.009914 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007841 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:39:02.009914 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007843 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:39:02.009914 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007846 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:39:02.010494 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007848 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:39:02.010494 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007850 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:39:02.010494 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007853 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:39:02.010494 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007855 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:39:02.010494 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007857 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:39:02.010494 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007860 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:39:02.010494 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007862 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:39:02.010494 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007864 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:39:02.010494 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007867 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:39:02.010494 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007869 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:39:02.010494 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007871 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:39:02.010494 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007874 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:39:02.010494 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007876 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:39:02.010494 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007878 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:39:02.010494 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007881 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:39:02.010494 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007883 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:39:02.010494 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007886 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:39:02.010494 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007888 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:39:02.010494 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007891 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:39:02.010494 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007893 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:39:02.010966 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007897 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:39:02.010966 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007901 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:39:02.010966 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007903 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:39:02.010966 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007906 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:39:02.010966 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007908 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:39:02.010966 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007911 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:39:02.010966 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007913 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:39:02.010966 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007915 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:39:02.010966 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007918 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:39:02.010966 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007920 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:39:02.010966 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007923 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:39:02.010966 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007925 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:39:02.010966 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007927 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:39:02.010966 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007929 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:39:02.010966 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007932 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:39:02.010966 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007934 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:39:02.010966 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007936 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:39:02.010966 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007939 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:39:02.010966 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007941 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:39:02.011431 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007944 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:39:02.011431 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007946 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:39:02.011431 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007948 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:39:02.011431 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007950 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:39:02.011431 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007953 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:39:02.011431 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007956 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:39:02.011431 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007958 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:39:02.011431 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007960 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:39:02.011431 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007962 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:39:02.011431 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007965 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:39:02.011431 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007968 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:39:02.011431 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007970 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:39:02.011431 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007973 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:39:02.011431 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007976 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:39:02.011431 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007978 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:39:02.011431 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007981 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:39:02.011431 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007983 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:39:02.011431 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007986 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:39:02.011431 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007988 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:39:02.011916 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007990 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:39:02.011916 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007992 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:39:02.011916 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:02.007995 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:39:02.011916 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.007999 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 16:39:02.011916 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.008764 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 16:39:02.011916 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.011486 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 16:39:02.012358 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.012348 2573 server.go:1019] "Starting client certificate rotation"
Apr 24 16:39:02.012462 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.012445 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 16:39:02.012512 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.012499 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 16:39:02.034590 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.034570 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 16:39:02.037691 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.037551 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 16:39:02.047489 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.047465 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 24 16:39:02.052519 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.052505 2573 log.go:25] "Validated CRI v1 image API"
Apr 24 16:39:02.054032 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.054018 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 16:39:02.058575 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.058556 2573 fs.go:135] Filesystem UUIDs: map[0a0e2503-c70d-44d9-8d4a-022a820d5bef:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 e2c20c6c-270d-4763-94cc-b5c45f8f0f7e:/dev/nvme0n1p4]
Apr 24 16:39:02.058651 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.058575 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 16:39:02.065573 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.065465 2573 manager.go:217] Machine: {Timestamp:2026-04-24 16:39:02.063805461 +0000 UTC m=+0.328520848 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3202261 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c0af92c4db7642831e56fb55b4a47 SystemUUID:ec2c0af9-2c4d-b764-2831-e56fb55b4a47 BootID:3327d310-61fc-4707-a0a5-237e09cb36e8 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ac:95:69:5b:75 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ac:95:69:5b:75 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:1e:5e:dc:73:2a:de Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 16:39:02.065573 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.065563 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 16:39:02.065726 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.065640 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 16:39:02.066650 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.066615 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 16:39:02.066807 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.066653 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-182.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 16:39:02.066880 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.066817 2573 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 16:39:02.066880 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.066829 2573 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 16:39:02.066880 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.066848
2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 16:39:02.067581 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.067568 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 16:39:02.068867 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.068856 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 24 16:39:02.069028 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.069017 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 16:39:02.070147 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.070132 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 16:39:02.071008 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.070996 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 24 16:39:02.071066 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.071014 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 16:39:02.071066 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.071030 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 16:39:02.071066 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.071042 2573 kubelet.go:397] "Adding apiserver pod source" Apr 24 16:39:02.071066 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.071054 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 16:39:02.072035 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.072022 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 16:39:02.072094 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.072044 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information 
tracking" Apr 24 16:39:02.074479 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.074454 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 16:39:02.075835 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.075821 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 16:39:02.077854 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.077842 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 16:39:02.077907 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.077858 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 16:39:02.077907 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.077864 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 16:39:02.077907 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.077870 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 16:39:02.077907 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.077875 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 16:39:02.077907 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.077881 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 16:39:02.077907 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.077887 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 16:39:02.077907 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.077892 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 16:39:02.077907 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.077898 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 16:39:02.077907 ip-10-0-142-182 
kubenswrapper[2573]: I0424 16:39:02.077904 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 16:39:02.078146 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.077912 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 16:39:02.078146 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.077920 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 16:39:02.078812 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.078801 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 16:39:02.078812 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.078813 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 16:39:02.081695 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.081681 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-182.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 16:39:02.082115 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.082103 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 16:39:02.082160 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.082138 2573 server.go:1295] "Started kubelet" Apr 24 16:39:02.082395 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:02.082371 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-182.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 16:39:02.082439 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:02.082371 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list 
resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 16:39:02.082470 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.082438 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 16:39:02.082470 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.082421 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 16:39:02.082520 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.082481 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 16:39:02.082921 ip-10-0-142-182 systemd[1]: Started Kubernetes Kubelet. Apr 24 16:39:02.084070 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.084054 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 16:39:02.084800 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.084786 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 24 16:39:02.088875 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.088857 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 16:39:02.088962 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.088876 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 16:39:02.089285 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:02.088288 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-182.ec2.internal.18a9586a6c5bf6d9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-182.ec2.internal,UID:ip-10-0-142-182.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-182.ec2.internal,},FirstTimestamp:2026-04-24 16:39:02.082115289 +0000 UTC m=+0.346830677,LastTimestamp:2026-04-24 16:39:02.082115289 +0000 UTC m=+0.346830677,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-182.ec2.internal,}" Apr 24 16:39:02.089600 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.089586 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 16:39:02.089672 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.089588 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 16:39:02.089672 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.089634 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 16:39:02.089785 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:02.089619 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-182.ec2.internal\" not found" Apr 24 16:39:02.089785 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.089751 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 24 16:39:02.089785 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.089761 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 24 16:39:02.093003 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.092977 2573 factory.go:55] Registering systemd factory Apr 24 16:39:02.093003 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.093002 2573 factory.go:223] Registration of the systemd container factory successfully Apr 24 16:39:02.093150 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:02.093129 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: 
csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 16:39:02.093263 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.093250 2573 factory.go:153] Registering CRI-O factory Apr 24 16:39:02.093351 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.093265 2573 factory.go:223] Registration of the crio container factory successfully Apr 24 16:39:02.093351 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.093348 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 16:39:02.093417 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.093372 2573 factory.go:103] Registering Raw factory Apr 24 16:39:02.093417 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.093386 2573 manager.go:1196] Started watching for new ooms in manager Apr 24 16:39:02.093600 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:02.093586 2573 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 16:39:02.094026 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.094012 2573 manager.go:319] Starting recovery of all containers Apr 24 16:39:02.094223 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:02.094191 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-142-182.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 16:39:02.094683 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.094659 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-86z4d" Apr 24 16:39:02.102843 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.102822 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-86z4d" Apr 24 16:39:02.105329 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.105296 2573 manager.go:324] Recovery completed Apr 24 16:39:02.108971 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.108959 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:39:02.111256 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.111236 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-182.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:39:02.111351 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.111275 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:39:02.111351 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.111292 2573 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-142-182.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:39:02.111823 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.111808 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 16:39:02.111823 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.111823 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 16:39:02.111905 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.111839 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 24 16:39:02.113492 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:02.113431 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-182.ec2.internal.18a9586a6e18a46c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-182.ec2.internal,UID:ip-10-0-142-182.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-142-182.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-142-182.ec2.internal,},FirstTimestamp:2026-04-24 16:39:02.111257708 +0000 UTC m=+0.375973102,LastTimestamp:2026-04-24 16:39:02.111257708 +0000 UTC m=+0.375973102,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-182.ec2.internal,}" Apr 24 16:39:02.115075 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.115064 2573 policy_none.go:49] "None policy: Start" Apr 24 16:39:02.115118 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.115079 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 16:39:02.115118 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.115088 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 24 
16:39:02.165300 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.148877 2573 manager.go:341] "Starting Device Plugin manager" Apr 24 16:39:02.165300 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:02.149026 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 16:39:02.165300 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.149039 2573 server.go:85] "Starting device plugin registration server" Apr 24 16:39:02.165300 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.149262 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 16:39:02.165300 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.149274 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 16:39:02.165300 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.149373 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 16:39:02.165300 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.149454 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 16:39:02.165300 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.149462 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 16:39:02.165300 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:02.149959 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 16:39:02.165300 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:02.149998 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-182.ec2.internal\" not found" Apr 24 16:39:02.210409 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.210382 2573 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 24 16:39:02.211744 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.211729 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 16:39:02.211827 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.211756 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 16:39:02.211827 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.211775 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 16:39:02.211827 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.211781 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 16:39:02.211827 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:02.211813 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 16:39:02.214478 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.214463 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:39:02.250153 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.250106 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:39:02.250922 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.250908 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-182.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:39:02.251001 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.250941 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:39:02.251001 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.250955 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-182.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:39:02.251001 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.250984 2573 
kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-182.ec2.internal" Apr 24 16:39:02.260866 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.260849 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-182.ec2.internal" Apr 24 16:39:02.260930 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:02.260869 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-182.ec2.internal\": node \"ip-10-0-142-182.ec2.internal\" not found" Apr 24 16:39:02.279362 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:02.279345 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-182.ec2.internal\" not found" Apr 24 16:39:02.311914 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.311893 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-182.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-182.ec2.internal"] Apr 24 16:39:02.311997 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.311964 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:39:02.313157 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.313144 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-182.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:39:02.313230 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.313175 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:39:02.313230 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.313190 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-182.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:39:02.315591 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.315576 2573 kubelet_node_status.go:413] "Setting 
node annotation to enable volume controller attach/detach" Apr 24 16:39:02.315712 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.315697 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-182.ec2.internal" Apr 24 16:39:02.315757 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.315726 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:39:02.316244 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.316223 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-182.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:39:02.316337 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.316223 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-182.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:39:02.316337 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.316288 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:39:02.316337 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.316302 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-182.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:39:02.316337 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.316254 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:39:02.316476 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.316339 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-182.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:39:02.318625 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.318612 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-182.ec2.internal" Apr 24 16:39:02.318683 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.318636 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:39:02.319271 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.319257 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-182.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:39:02.319360 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.319284 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:39:02.319360 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.319294 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-182.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:39:02.335857 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:02.335844 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-182.ec2.internal\" not found" node="ip-10-0-142-182.ec2.internal" Apr 24 16:39:02.339544 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:02.339530 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-182.ec2.internal\" not found" node="ip-10-0-142-182.ec2.internal" Apr 24 16:39:02.379877 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:02.379854 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-182.ec2.internal\" not found" Apr 24 16:39:02.390963 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.390942 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5d34cbf5b33a49e6ab42e122744bf511-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-142-182.ec2.internal\" (UID: \"5d34cbf5b33a49e6ab42e122744bf511\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-182.ec2.internal" Apr 24 16:39:02.391038 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.390965 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5d34cbf5b33a49e6ab42e122744bf511-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-182.ec2.internal\" (UID: \"5d34cbf5b33a49e6ab42e122744bf511\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-182.ec2.internal" Apr 24 16:39:02.391038 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.390981 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/63bf82c7cb78a8f9ba0c1c606e45e954-config\") pod \"kube-apiserver-proxy-ip-10-0-142-182.ec2.internal\" (UID: \"63bf82c7cb78a8f9ba0c1c606e45e954\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-182.ec2.internal" Apr 24 16:39:02.480289 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:02.480262 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-182.ec2.internal\" not found" Apr 24 16:39:02.491649 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.491630 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5d34cbf5b33a49e6ab42e122744bf511-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-182.ec2.internal\" (UID: \"5d34cbf5b33a49e6ab42e122744bf511\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-182.ec2.internal" Apr 24 16:39:02.491714 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.491653 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/5d34cbf5b33a49e6ab42e122744bf511-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-182.ec2.internal\" (UID: \"5d34cbf5b33a49e6ab42e122744bf511\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-182.ec2.internal" Apr 24 16:39:02.491714 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.491669 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/63bf82c7cb78a8f9ba0c1c606e45e954-config\") pod \"kube-apiserver-proxy-ip-10-0-142-182.ec2.internal\" (UID: \"63bf82c7cb78a8f9ba0c1c606e45e954\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-182.ec2.internal" Apr 24 16:39:02.491775 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.491715 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/63bf82c7cb78a8f9ba0c1c606e45e954-config\") pod \"kube-apiserver-proxy-ip-10-0-142-182.ec2.internal\" (UID: \"63bf82c7cb78a8f9ba0c1c606e45e954\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-182.ec2.internal" Apr 24 16:39:02.491775 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.491720 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5d34cbf5b33a49e6ab42e122744bf511-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-182.ec2.internal\" (UID: \"5d34cbf5b33a49e6ab42e122744bf511\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-182.ec2.internal" Apr 24 16:39:02.491775 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.491742 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5d34cbf5b33a49e6ab42e122744bf511-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-182.ec2.internal\" (UID: \"5d34cbf5b33a49e6ab42e122744bf511\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-182.ec2.internal" Apr 24 16:39:02.581360 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:02.581299 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-182.ec2.internal\" not found" Apr 24 16:39:02.637869 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.637839 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-182.ec2.internal" Apr 24 16:39:02.641754 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:02.641734 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-182.ec2.internal" Apr 24 16:39:02.682226 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:02.682196 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-182.ec2.internal\" not found" Apr 24 16:39:02.782836 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:02.782761 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-182.ec2.internal\" not found" Apr 24 16:39:02.883270 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:02.883229 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-182.ec2.internal\" not found" Apr 24 16:39:02.983908 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:02.983881 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-182.ec2.internal\" not found" Apr 24 16:39:03.012368 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:03.012341 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 16:39:03.012740 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:03.012498 2573 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 16:39:03.084636 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:03.084609 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-182.ec2.internal\" not found" Apr 24 16:39:03.089201 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:03.089180 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 16:39:03.101084 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:03.101061 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 16:39:03.105554 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:03.105533 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 16:34:02 +0000 UTC" deadline="2028-01-06 02:19:05.733577272 +0000 UTC" Apr 24 16:39:03.105618 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:03.105556 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14913h40m2.628023333s" Apr 24 16:39:03.128817 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:03.128796 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-qkdm5" Apr 24 16:39:03.137672 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:03.137652 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-qkdm5" Apr 24 16:39:03.147616 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:03.147592 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d34cbf5b33a49e6ab42e122744bf511.slice/crio-4cdfc4a4cc61dbd6c2c4a0cb608a650b31db223bb51c2e52309e93a5f507b39a WatchSource:0}: Error finding container 4cdfc4a4cc61dbd6c2c4a0cb608a650b31db223bb51c2e52309e93a5f507b39a: Status 404 returned error can't find the container with id 4cdfc4a4cc61dbd6c2c4a0cb608a650b31db223bb51c2e52309e93a5f507b39a Apr 24 16:39:03.151123 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:03.151109 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 16:39:03.174056 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:03.174029 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63bf82c7cb78a8f9ba0c1c606e45e954.slice/crio-10524bde108a9c22bfb598c867cf91c3e224e09ef68d5513c3bee1d917618274 WatchSource:0}: Error finding container 10524bde108a9c22bfb598c867cf91c3e224e09ef68d5513c3bee1d917618274: Status 404 returned error can't find the container with id 10524bde108a9c22bfb598c867cf91c3e224e09ef68d5513c3bee1d917618274 Apr 24 16:39:03.184899 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:03.184883 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-182.ec2.internal\" not found" Apr 24 16:39:03.215065 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:03.215006 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-182.ec2.internal" event={"ID":"63bf82c7cb78a8f9ba0c1c606e45e954","Type":"ContainerStarted","Data":"10524bde108a9c22bfb598c867cf91c3e224e09ef68d5513c3bee1d917618274"} Apr 24 16:39:03.215805 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:03.215786 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-182.ec2.internal" 
event={"ID":"5d34cbf5b33a49e6ab42e122744bf511","Type":"ContainerStarted","Data":"4cdfc4a4cc61dbd6c2c4a0cb608a650b31db223bb51c2e52309e93a5f507b39a"} Apr 24 16:39:03.258587 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:03.258562 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:39:03.285491 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:03.285468 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-182.ec2.internal\" not found" Apr 24 16:39:03.292866 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:03.292807 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:39:03.308830 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:03.308810 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:39:03.389800 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:03.389780 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-182.ec2.internal" Apr 24 16:39:03.401484 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:03.401467 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 16:39:03.402389 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:03.402378 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-182.ec2.internal" Apr 24 16:39:03.412690 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:03.412674 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 16:39:03.854380 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:03.854346 
2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:39:04.071955 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.071903 2573 apiserver.go:52] "Watching apiserver" Apr 24 16:39:04.078708 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.078678 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 16:39:04.079020 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.078999 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-djdhr","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-182.ec2.internal","openshift-multus/multus-additional-cni-plugins-x52vd","openshift-network-diagnostics/network-check-target-mgmm7","openshift-network-operator/iptables-alerter-75zzc","openshift-ovn-kubernetes/ovnkube-node-lkfqt","kube-system/kube-apiserver-proxy-ip-10-0-142-182.ec2.internal","openshift-cluster-node-tuning-operator/tuned-hz7p4","openshift-image-registry/node-ca-slcg8","openshift-multus/multus-tkpx8","openshift-multus/network-metrics-daemon-74mjh","kube-system/konnectivity-agent-m2nnx","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv"] Apr 24 16:39:04.081989 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.081959 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tkpx8" Apr 24 16:39:04.084180 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.084159 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x52vd" Apr 24 16:39:04.084344 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.084321 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 16:39:04.084530 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.084514 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 16:39:04.084617 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.084517 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 16:39:04.084733 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.084717 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-p7chn\"" Apr 24 16:39:04.085000 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.084981 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 16:39:04.086471 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.086278 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mgmm7" Apr 24 16:39:04.086471 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.086348 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 16:39:04.086471 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:04.086354 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mgmm7" podUID="4f65e195-bb0b-4b16-893b-21667f13f3a5" Apr 24 16:39:04.086471 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.086407 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-gc979\"" Apr 24 16:39:04.086709 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.086606 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 16:39:04.088489 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.088474 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-75zzc" Apr 24 16:39:04.090470 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.090454 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:39:04.090584 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.090569 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 16:39:04.091271 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.090765 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jgsvz\"" Apr 24 16:39:04.091271 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.090765 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 16:39:04.091271 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.090853 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.095980 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.095513 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dvkg8\"" Apr 24 16:39:04.095980 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.095563 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.095980 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.095719 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 16:39:04.095980 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.095744 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 16:39:04.095980 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.095719 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 16:39:04.095980 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.095763 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 16:39:04.095980 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.095911 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 16:39:04.095980 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.095942 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 16:39:04.096373 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.095985 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-slcg8" Apr 24 16:39:04.098362 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.098293 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-djdhr" Apr 24 16:39:04.098362 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.098354 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:39:04.098592 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.098572 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 16:39:04.098739 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.098722 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-tgjhv\"" Apr 24 16:39:04.098859 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.098838 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-lqszl\"" Apr 24 16:39:04.098922 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.098865 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 16:39:04.098994 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.098976 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 16:39:04.099206 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.099187 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 16:39:04.100406 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.100389 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 16:39:04.101079 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.100629 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 16:39:04.101079 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.100697 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-74mjh" Apr 24 16:39:04.101079 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.100712 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-host-run-k8s-cni-cncf-io\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8" Apr 24 16:39:04.101079 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.100743 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-host-var-lib-cni-bin\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8" Apr 24 16:39:04.101079 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:04.100763 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-74mjh" podUID="3052a162-5d36-4309-9bd0-bca01410b715" Apr 24 16:39:04.101079 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.100768 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/85c3e0af-8027-49c2-937d-99acbd5f7085-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd" Apr 24 16:39:04.101079 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.100809 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/85c3e0af-8027-49c2-937d-99acbd5f7085-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd" Apr 24 16:39:04.101079 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.100858 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-mpw6l\"" Apr 24 16:39:04.101079 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.100861 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-run-openvswitch\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.101079 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.100887 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-node-log\") pod \"ovnkube-node-lkfqt\" (UID: 
\"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.101079 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.100928 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-system-cni-dir\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8" Apr 24 16:39:04.101079 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.100952 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/85c3e0af-8027-49c2-937d-99acbd5f7085-cnibin\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd" Apr 24 16:39:04.101079 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.100993 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f5834b4f-9b4a-49b9-9cee-068a23d3d4a8-host-slash\") pod \"iptables-alerter-75zzc\" (UID: \"f5834b4f-9b4a-49b9-9cee-068a23d3d4a8\") " pod="openshift-network-operator/iptables-alerter-75zzc" Apr 24 16:39:04.101079 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101018 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-host-kubelet\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.101079 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101042 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-run-ovn\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.101079 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101083 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.101866 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101111 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/86788d4c-c402-4057-988b-77279d8fd61c-ovnkube-config\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.101866 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101131 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/86788d4c-c402-4057-988b-77279d8fd61c-ovn-node-metrics-cert\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.101866 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101147 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-hostroot\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8" Apr 24 16:39:04.101866 ip-10-0-142-182 kubenswrapper[2573]: I0424 
16:39:04.101169 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/85c3e0af-8027-49c2-937d-99acbd5f7085-cni-binary-copy\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd" Apr 24 16:39:04.101866 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101203 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frfjh\" (UniqueName: \"kubernetes.io/projected/f5834b4f-9b4a-49b9-9cee-068a23d3d4a8-kube-api-access-frfjh\") pod \"iptables-alerter-75zzc\" (UID: \"f5834b4f-9b4a-49b9-9cee-068a23d3d4a8\") " pod="openshift-network-operator/iptables-alerter-75zzc" Apr 24 16:39:04.101866 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101220 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-var-lib-openvswitch\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.101866 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101240 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-os-release\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8" Apr 24 16:39:04.101866 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101264 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/df6acdc8-67c6-4733-b49a-03a69f37ba5b-cni-binary-copy\") pod \"multus-tkpx8\" (UID: 
\"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8" Apr 24 16:39:04.101866 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101286 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpjv4\" (UniqueName: \"kubernetes.io/projected/85c3e0af-8027-49c2-937d-99acbd5f7085-kube-api-access-gpjv4\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd" Apr 24 16:39:04.101866 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101325 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-systemd-units\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.101866 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101350 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-run-systemd\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.101866 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101372 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/86788d4c-c402-4057-988b-77279d8fd61c-env-overrides\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.101866 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101400 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-h7sbg\" (UniqueName: \"kubernetes.io/projected/86788d4c-c402-4057-988b-77279d8fd61c-kube-api-access-h7sbg\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.101866 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101420 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-host-run-multus-certs\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8" Apr 24 16:39:04.101866 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101460 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4xhw\" (UniqueName: \"kubernetes.io/projected/df6acdc8-67c6-4733-b49a-03a69f37ba5b-kube-api-access-q4xhw\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8" Apr 24 16:39:04.101866 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101491 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f66k6\" (UniqueName: \"kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6\") pod \"network-check-target-mgmm7\" (UID: \"4f65e195-bb0b-4b16-893b-21667f13f3a5\") " pod="openshift-network-diagnostics/network-check-target-mgmm7" Apr 24 16:39:04.102540 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101522 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-log-socket\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.102540 ip-10-0-142-182 
kubenswrapper[2573]: I0424 16:39:04.101562    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-cnibin\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.102540 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101595    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-host-run-netns\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.102540 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101628    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-multus-conf-dir\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.102540 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101650    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-etc-kubernetes\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.102540 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101686    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f5834b4f-9b4a-49b9-9cee-068a23d3d4a8-iptables-alerter-script\") pod \"iptables-alerter-75zzc\" (UID: \"f5834b4f-9b4a-49b9-9cee-068a23d3d4a8\") " pod="openshift-network-operator/iptables-alerter-75zzc"
Apr 24 16:39:04.102540 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101722    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-host-slash\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.102540 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101761    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-host-var-lib-cni-multus\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.102540 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101800    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-host-var-lib-kubelet\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.102540 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101834    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/85c3e0af-8027-49c2-937d-99acbd5f7085-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd"
Apr 24 16:39:04.102540 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101861    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-host-run-netns\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.102540 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101886    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-host-cni-bin\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.102540 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101908    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-host-cni-netd\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.102540 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101931    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/86788d4c-c402-4057-988b-77279d8fd61c-ovnkube-script-lib\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.102540 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101952    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-multus-cni-dir\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.102540 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.101975    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/df6acdc8-67c6-4733-b49a-03a69f37ba5b-multus-daemon-config\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.102540 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.102007    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/85c3e0af-8027-49c2-937d-99acbd5f7085-system-cni-dir\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd"
Apr 24 16:39:04.103383 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.102029    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/85c3e0af-8027-49c2-937d-99acbd5f7085-os-release\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd"
Apr 24 16:39:04.103383 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.102052    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-etc-openvswitch\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.103383 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.102074    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-host-run-ovn-kubernetes\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.103383 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.102096    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-multus-socket-dir-parent\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.103383 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.103057    2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-m2nnx"
Apr 24 16:39:04.104937 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.104895    2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 16:39:04.105106 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.105092    2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8md7w\""
Apr 24 16:39:04.105180 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.105113    2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 16:39:04.105391 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.105372    2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv"
Apr 24 16:39:04.107328 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.107290    2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 16:39:04.107540 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.107502    2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-cgfwf\""
Apr 24 16:39:04.107627 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.107526    2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 16:39:04.107627 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.107565    2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 16:39:04.138231 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.138204    2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 16:34:03 +0000 UTC" deadline="2027-10-08 11:15:53.756098115 +0000 UTC"
Apr 24 16:39:04.138231 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.138230    2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12762h36m49.617871326s"
Apr 24 16:39:04.190436 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.190402    2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 16:39:04.202591 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.202560    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/85c3e0af-8027-49c2-937d-99acbd5f7085-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd"
Apr 24 16:39:04.202591 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.202595    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-etc-sysctl-d\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4"
Apr 24 16:39:04.202781 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.202610    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-etc-systemd\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4"
Apr 24 16:39:04.202781 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.202628    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-host-run-netns\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.202781 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.202645    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/86788d4c-c402-4057-988b-77279d8fd61c-ovnkube-script-lib\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.202781 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.202696    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/df6acdc8-67c6-4733-b49a-03a69f37ba5b-multus-daemon-config\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.202781 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.202698    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-host-run-netns\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.202781 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.202712    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgvb5\" (UniqueName: \"kubernetes.io/projected/35142d06-6635-497c-8778-a5ebcc31867f-kube-api-access-hgvb5\") pod \"aws-ebs-csi-driver-node-66bmv\" (UID: \"35142d06-6635-497c-8778-a5ebcc31867f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv"
Apr 24 16:39:04.202781 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.202754    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-etc-openvswitch\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.202781 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.202779    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/85c3e0af-8027-49c2-937d-99acbd5f7085-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd"
Apr 24 16:39:04.203100 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.202862    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-host-run-ovn-kubernetes\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.203100 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.202900    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-host-run-k8s-cni-cncf-io\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.203100 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.202916    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-host-run-ovn-kubernetes\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.203100 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.202927    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-host-var-lib-cni-bin\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.203100 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.202930    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-host-run-k8s-cni-cncf-io\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.203100 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.202954    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/85c3e0af-8027-49c2-937d-99acbd5f7085-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd"
Apr 24 16:39:04.203100 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.202964    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-etc-openvswitch\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.203100 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.202978    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f5834b4f-9b4a-49b9-9cee-068a23d3d4a8-host-slash\") pod \"iptables-alerter-75zzc\" (UID: \"f5834b4f-9b4a-49b9-9cee-068a23d3d4a8\") " pod="openshift-network-operator/iptables-alerter-75zzc"
Apr 24 16:39:04.203100 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.202978    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-host-var-lib-cni-bin\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.203100 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203029    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-etc-sysconfig\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4"
Apr 24 16:39:04.203100 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203039    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f5834b4f-9b4a-49b9-9cee-068a23d3d4a8-host-slash\") pod \"iptables-alerter-75zzc\" (UID: \"f5834b4f-9b4a-49b9-9cee-068a23d3d4a8\") " pod="openshift-network-operator/iptables-alerter-75zzc"
Apr 24 16:39:04.203100 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203059    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-etc-sysctl-conf\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4"
Apr 24 16:39:04.203100 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203087    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-run-openvswitch\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.203675 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203112    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-system-cni-dir\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.203675 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203135    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/086451b6-0c60-4e42-8500-d8f31f29bb1e-agent-certs\") pod \"konnectivity-agent-m2nnx\" (UID: \"086451b6-0c60-4e42-8500-d8f31f29bb1e\") " pod="kube-system/konnectivity-agent-m2nnx"
Apr 24 16:39:04.203675 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203161    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2267e8c5-f77e-4e17-a96f-9463ea75c147-tmp-dir\") pod \"node-resolver-djdhr\" (UID: \"2267e8c5-f77e-4e17-a96f-9463ea75c147\") " pod="openshift-dns/node-resolver-djdhr"
Apr 24 16:39:04.203675 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203170    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-run-openvswitch\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.203675 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203183    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs\") pod \"network-metrics-daemon-74mjh\" (UID: \"3052a162-5d36-4309-9bd0-bca01410b715\") " pod="openshift-multus/network-metrics-daemon-74mjh"
Apr 24 16:39:04.203675 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203198    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-system-cni-dir\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.203675 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203210    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-host-kubelet\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.203675 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203234    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/86788d4c-c402-4057-988b-77279d8fd61c-ovnkube-config\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.203675 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203257    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-hostroot\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.203675 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203281    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frfjh\" (UniqueName: \"kubernetes.io/projected/f5834b4f-9b4a-49b9-9cee-068a23d3d4a8-kube-api-access-frfjh\") pod \"iptables-alerter-75zzc\" (UID: \"f5834b4f-9b4a-49b9-9cee-068a23d3d4a8\") " pod="openshift-network-operator/iptables-alerter-75zzc"
Apr 24 16:39:04.203675 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203260    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-host-kubelet\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.203675 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203301    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-os-release\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.203675 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203358    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-hostroot\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.203675 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203373    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-systemd-units\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.203675 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203383    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/df6acdc8-67c6-4733-b49a-03a69f37ba5b-multus-daemon-config\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.203675 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203388    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-os-release\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.203675 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203391    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/86788d4c-c402-4057-988b-77279d8fd61c-ovnkube-script-lib\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.203675 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203400    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-host-run-multus-certs\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.204237 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203433    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-systemd-units\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.204237 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203441    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-host-run-multus-certs\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.204237 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203477    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4xhw\" (UniqueName: \"kubernetes.io/projected/df6acdc8-67c6-4733-b49a-03a69f37ba5b-kube-api-access-q4xhw\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.204237 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203492    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/85c3e0af-8027-49c2-937d-99acbd5f7085-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd"
Apr 24 16:39:04.204237 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203508    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/35142d06-6635-497c-8778-a5ebcc31867f-sys-fs\") pod \"aws-ebs-csi-driver-node-66bmv\" (UID: \"35142d06-6635-497c-8778-a5ebcc31867f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv"
Apr 24 16:39:04.204237 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203531    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6fd88de9-ce02-44e6-9d5c-0b5dbe13f0c4-serviceca\") pod \"node-ca-slcg8\" (UID: \"6fd88de9-ce02-44e6-9d5c-0b5dbe13f0c4\") " pod="openshift-image-registry/node-ca-slcg8"
Apr 24 16:39:04.204237 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203556    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-log-socket\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.204237 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203592    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-multus-conf-dir\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.204237 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203600    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-log-socket\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.204237 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203616    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-etc-kubernetes\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.204237 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203634    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-multus-conf-dir\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.204237 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203648    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-lib-modules\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4"
Apr 24 16:39:04.204237 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203653    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-etc-kubernetes\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.204237 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203680    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-host-slash\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.204237 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203699    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-host-var-lib-cni-multus\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.204237 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203716    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-host-var-lib-kubelet\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.204237 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203734    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-host-slash\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.204237 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203751    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-host-var-lib-kubelet\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.204934 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203753    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-host-var-lib-cni-multus\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.204934 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203785    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-etc-kubernetes\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4"
Apr 24 16:39:04.204934 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203785    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/86788d4c-c402-4057-988b-77279d8fd61c-ovnkube-config\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.204934 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203809    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/35142d06-6635-497c-8778-a5ebcc31867f-socket-dir\") pod \"aws-ebs-csi-driver-node-66bmv\" (UID: \"35142d06-6635-497c-8778-a5ebcc31867f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv"
Apr 24 16:39:04.204934 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203837    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-host-cni-bin\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.204934 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203861    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-host-cni-bin\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.204934 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203882    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-host-cni-netd\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.204934 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203900    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-multus-cni-dir\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.204934 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203911    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-host-cni-netd\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:04.204934 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203915    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/85c3e0af-8027-49c2-937d-99acbd5f7085-system-cni-dir\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd"
Apr 24 16:39:04.204934 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203932    2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/85c3e0af-8027-49c2-937d-99acbd5f7085-system-cni-dir\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd"
Apr 24 16:39:04.204934 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203941    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/85c3e0af-8027-49c2-937d-99acbd5f7085-os-release\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd"
Apr 24 16:39:04.204934 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203980    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/85c3e0af-8027-49c2-937d-99acbd5f7085-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd"
Apr 24 16:39:04.204934 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.203997    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/35142d06-6635-497c-8778-a5ebcc31867f-etc-selinux\") pod \"aws-ebs-csi-driver-node-66bmv\" (UID: \"35142d06-6635-497c-8778-a5ebcc31867f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv"
Apr 24 16:39:04.204934 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204013    2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-multus-socket-dir-parent\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8"
Apr 24 16:39:04.204934 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204035    2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cntsh\" (UniqueName:
\"kubernetes.io/projected/6fd88de9-ce02-44e6-9d5c-0b5dbe13f0c4-kube-api-access-cntsh\") pod \"node-ca-slcg8\" (UID: \"6fd88de9-ce02-44e6-9d5c-0b5dbe13f0c4\") " pod="openshift-image-registry/node-ca-slcg8" Apr 24 16:39:04.204934 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204037 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-multus-cni-dir\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8" Apr 24 16:39:04.205459 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204060 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/086451b6-0c60-4e42-8500-d8f31f29bb1e-konnectivity-ca\") pod \"konnectivity-agent-m2nnx\" (UID: \"086451b6-0c60-4e42-8500-d8f31f29bb1e\") " pod="kube-system/konnectivity-agent-m2nnx" Apr 24 16:39:04.205459 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204085 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-etc-modprobe-d\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.205459 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204088 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/85c3e0af-8027-49c2-937d-99acbd5f7085-os-release\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd" Apr 24 16:39:04.205459 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204096 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-multus-socket-dir-parent\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8" Apr 24 16:39:04.205459 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204118 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-sys\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.205459 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204137 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-etc-tuned\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.205459 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204153 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-node-log\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.205459 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204168 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/85c3e0af-8027-49c2-937d-99acbd5f7085-cnibin\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd" Apr 24 16:39:04.205459 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204183 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/35142d06-6635-497c-8778-a5ebcc31867f-registration-dir\") pod \"aws-ebs-csi-driver-node-66bmv\" (UID: \"35142d06-6635-497c-8778-a5ebcc31867f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv" Apr 24 16:39:04.205459 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204200 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhkn9\" (UniqueName: \"kubernetes.io/projected/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-kube-api-access-fhkn9\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.205459 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204215 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/85c3e0af-8027-49c2-937d-99acbd5f7085-cnibin\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd" Apr 24 16:39:04.205459 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204220 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hwrx\" (UniqueName: \"kubernetes.io/projected/2267e8c5-f77e-4e17-a96f-9463ea75c147-kube-api-access-4hwrx\") pod \"node-resolver-djdhr\" (UID: \"2267e8c5-f77e-4e17-a96f-9463ea75c147\") " pod="openshift-dns/node-resolver-djdhr" Apr 24 16:39:04.205459 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204215 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-node-log\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 
16:39:04.205459 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204239 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-run-ovn\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.205459 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204254 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.205459 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204276 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/86788d4c-c402-4057-988b-77279d8fd61c-ovn-node-metrics-cert\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.205459 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204318 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/85c3e0af-8027-49c2-937d-99acbd5f7085-cni-binary-copy\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd" Apr 24 16:39:04.206000 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204324 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-run-ovn\") pod \"ovnkube-node-lkfqt\" (UID: 
\"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.206000 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204333 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.206000 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204344 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35142d06-6635-497c-8778-a5ebcc31867f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-66bmv\" (UID: \"35142d06-6635-497c-8778-a5ebcc31867f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv" Apr 24 16:39:04.206000 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204407 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6fd88de9-ce02-44e6-9d5c-0b5dbe13f0c4-host\") pod \"node-ca-slcg8\" (UID: \"6fd88de9-ce02-44e6-9d5c-0b5dbe13f0c4\") " pod="openshift-image-registry/node-ca-slcg8" Apr 24 16:39:04.206000 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204408 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/85c3e0af-8027-49c2-937d-99acbd5f7085-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd" Apr 24 16:39:04.206000 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204440 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2267e8c5-f77e-4e17-a96f-9463ea75c147-hosts-file\") pod \"node-resolver-djdhr\" (UID: \"2267e8c5-f77e-4e17-a96f-9463ea75c147\") " pod="openshift-dns/node-resolver-djdhr" Apr 24 16:39:04.206000 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204461 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf6xz\" (UniqueName: \"kubernetes.io/projected/3052a162-5d36-4309-9bd0-bca01410b715-kube-api-access-kf6xz\") pod \"network-metrics-daemon-74mjh\" (UID: \"3052a162-5d36-4309-9bd0-bca01410b715\") " pod="openshift-multus/network-metrics-daemon-74mjh" Apr 24 16:39:04.206000 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204477 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-var-lib-openvswitch\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.206000 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204491 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/df6acdc8-67c6-4733-b49a-03a69f37ba5b-cni-binary-copy\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8" Apr 24 16:39:04.206000 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204526 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gpjv4\" (UniqueName: \"kubernetes.io/projected/85c3e0af-8027-49c2-937d-99acbd5f7085-kube-api-access-gpjv4\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd" Apr 24 16:39:04.206000 ip-10-0-142-182 kubenswrapper[2573]: I0424 
16:39:04.204538 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-var-lib-openvswitch\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.206000 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204557 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-run\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.206000 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204579 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-host\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.206000 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204578 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 16:39:04.206000 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204602 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-run-systemd\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.206000 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204629 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/86788d4c-c402-4057-988b-77279d8fd61c-env-overrides\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.206000 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204651 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/86788d4c-c402-4057-988b-77279d8fd61c-run-systemd\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.206518 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204655 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7sbg\" (UniqueName: \"kubernetes.io/projected/86788d4c-c402-4057-988b-77279d8fd61c-kube-api-access-h7sbg\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.206518 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204685 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f66k6\" (UniqueName: 
\"kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6\") pod \"network-check-target-mgmm7\" (UID: \"4f65e195-bb0b-4b16-893b-21667f13f3a5\") " pod="openshift-network-diagnostics/network-check-target-mgmm7" Apr 24 16:39:04.206518 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204711 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-var-lib-kubelet\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.206518 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204735 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-tmp\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.206518 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204759 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-cnibin\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8" Apr 24 16:39:04.206518 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204768 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/85c3e0af-8027-49c2-937d-99acbd5f7085-cni-binary-copy\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd" Apr 24 16:39:04.206518 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204783 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-host-run-netns\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8" Apr 24 16:39:04.206518 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204807 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f5834b4f-9b4a-49b9-9cee-068a23d3d4a8-iptables-alerter-script\") pod \"iptables-alerter-75zzc\" (UID: \"f5834b4f-9b4a-49b9-9cee-068a23d3d4a8\") " pod="openshift-network-operator/iptables-alerter-75zzc" Apr 24 16:39:04.206518 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204833 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/35142d06-6635-497c-8778-a5ebcc31867f-device-dir\") pod \"aws-ebs-csi-driver-node-66bmv\" (UID: \"35142d06-6635-497c-8778-a5ebcc31867f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv" Apr 24 16:39:04.206518 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204842 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-cnibin\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8" Apr 24 16:39:04.206518 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204921 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/df6acdc8-67c6-4733-b49a-03a69f37ba5b-host-run-netns\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8" Apr 24 16:39:04.206518 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.204949 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/86788d4c-c402-4057-988b-77279d8fd61c-env-overrides\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.206518 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.205196 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f5834b4f-9b4a-49b9-9cee-068a23d3d4a8-iptables-alerter-script\") pod \"iptables-alerter-75zzc\" (UID: \"f5834b4f-9b4a-49b9-9cee-068a23d3d4a8\") " pod="openshift-network-operator/iptables-alerter-75zzc" Apr 24 16:39:04.206518 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.205455 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/df6acdc8-67c6-4733-b49a-03a69f37ba5b-cni-binary-copy\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8" Apr 24 16:39:04.208119 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.208100 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/86788d4c-c402-4057-988b-77279d8fd61c-ovn-node-metrics-cert\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.211137 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.211119 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frfjh\" (UniqueName: \"kubernetes.io/projected/f5834b4f-9b4a-49b9-9cee-068a23d3d4a8-kube-api-access-frfjh\") pod \"iptables-alerter-75zzc\" (UID: \"f5834b4f-9b4a-49b9-9cee-068a23d3d4a8\") " pod="openshift-network-operator/iptables-alerter-75zzc" Apr 24 16:39:04.211250 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.211135 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-q4xhw\" (UniqueName: \"kubernetes.io/projected/df6acdc8-67c6-4733-b49a-03a69f37ba5b-kube-api-access-q4xhw\") pod \"multus-tkpx8\" (UID: \"df6acdc8-67c6-4733-b49a-03a69f37ba5b\") " pod="openshift-multus/multus-tkpx8" Apr 24 16:39:04.213491 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:04.213454 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:04.213595 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:04.213497 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:04.213595 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:04.213511 2573 projected.go:194] Error preparing data for projected volume kube-api-access-f66k6 for pod openshift-network-diagnostics/network-check-target-mgmm7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:04.213694 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:04.213637 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6 podName:4f65e195-bb0b-4b16-893b-21667f13f3a5 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:04.713590244 +0000 UTC m=+2.978305641 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-f66k6" (UniqueName: "kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6") pod "network-check-target-mgmm7" (UID: "4f65e195-bb0b-4b16-893b-21667f13f3a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:04.215888 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.215864 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7sbg\" (UniqueName: \"kubernetes.io/projected/86788d4c-c402-4057-988b-77279d8fd61c-kube-api-access-h7sbg\") pod \"ovnkube-node-lkfqt\" (UID: \"86788d4c-c402-4057-988b-77279d8fd61c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.216522 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.216503 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpjv4\" (UniqueName: \"kubernetes.io/projected/85c3e0af-8027-49c2-937d-99acbd5f7085-kube-api-access-gpjv4\") pod \"multus-additional-cni-plugins-x52vd\" (UID: \"85c3e0af-8027-49c2-937d-99acbd5f7085\") " pod="openshift-multus/multus-additional-cni-plugins-x52vd" Apr 24 16:39:04.305948 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.305915 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-etc-sysctl-d\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.305948 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.305955 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-etc-systemd\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " 
pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.306176 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.305972 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hgvb5\" (UniqueName: \"kubernetes.io/projected/35142d06-6635-497c-8778-a5ebcc31867f-kube-api-access-hgvb5\") pod \"aws-ebs-csi-driver-node-66bmv\" (UID: \"35142d06-6635-497c-8778-a5ebcc31867f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv" Apr 24 16:39:04.306176 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306086 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-etc-systemd\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.306176 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306120 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-etc-sysctl-d\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.306176 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306131 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-etc-sysconfig\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.306176 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306160 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-etc-sysctl-conf\") pod \"tuned-hz7p4\" (UID: 
\"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.306394 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306190 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/086451b6-0c60-4e42-8500-d8f31f29bb1e-agent-certs\") pod \"konnectivity-agent-m2nnx\" (UID: \"086451b6-0c60-4e42-8500-d8f31f29bb1e\") " pod="kube-system/konnectivity-agent-m2nnx" Apr 24 16:39:04.306394 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306218 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2267e8c5-f77e-4e17-a96f-9463ea75c147-tmp-dir\") pod \"node-resolver-djdhr\" (UID: \"2267e8c5-f77e-4e17-a96f-9463ea75c147\") " pod="openshift-dns/node-resolver-djdhr" Apr 24 16:39:04.306394 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306239 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-etc-sysconfig\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.306394 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306245 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs\") pod \"network-metrics-daemon-74mjh\" (UID: \"3052a162-5d36-4309-9bd0-bca01410b715\") " pod="openshift-multus/network-metrics-daemon-74mjh" Apr 24 16:39:04.306394 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306278 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/35142d06-6635-497c-8778-a5ebcc31867f-sys-fs\") pod \"aws-ebs-csi-driver-node-66bmv\" (UID: 
\"35142d06-6635-497c-8778-a5ebcc31867f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv" Apr 24 16:39:04.306394 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306301 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6fd88de9-ce02-44e6-9d5c-0b5dbe13f0c4-serviceca\") pod \"node-ca-slcg8\" (UID: \"6fd88de9-ce02-44e6-9d5c-0b5dbe13f0c4\") " pod="openshift-image-registry/node-ca-slcg8" Apr 24 16:39:04.306394 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306329 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-etc-sysctl-conf\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.306394 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306344 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-lib-modules\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.306394 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306372 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-etc-kubernetes\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.306394 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306385 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/35142d06-6635-497c-8778-a5ebcc31867f-sys-fs\") pod \"aws-ebs-csi-driver-node-66bmv\" (UID: 
\"35142d06-6635-497c-8778-a5ebcc31867f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv" Apr 24 16:39:04.306394 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:04.306391 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:04.306394 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306395 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/35142d06-6635-497c-8778-a5ebcc31867f-socket-dir\") pod \"aws-ebs-csi-driver-node-66bmv\" (UID: \"35142d06-6635-497c-8778-a5ebcc31867f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv" Apr 24 16:39:04.306764 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306425 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/35142d06-6635-497c-8778-a5ebcc31867f-etc-selinux\") pod \"aws-ebs-csi-driver-node-66bmv\" (UID: \"35142d06-6635-497c-8778-a5ebcc31867f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv" Apr 24 16:39:04.306764 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:04.306466 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs podName:3052a162-5d36-4309-9bd0-bca01410b715 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:04.806446217 +0000 UTC m=+3.071161615 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs") pod "network-metrics-daemon-74mjh" (UID: "3052a162-5d36-4309-9bd0-bca01410b715") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:04.306764 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306496 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cntsh\" (UniqueName: \"kubernetes.io/projected/6fd88de9-ce02-44e6-9d5c-0b5dbe13f0c4-kube-api-access-cntsh\") pod \"node-ca-slcg8\" (UID: \"6fd88de9-ce02-44e6-9d5c-0b5dbe13f0c4\") " pod="openshift-image-registry/node-ca-slcg8" Apr 24 16:39:04.306764 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306501 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-etc-kubernetes\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.306764 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306520 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/086451b6-0c60-4e42-8500-d8f31f29bb1e-konnectivity-ca\") pod \"konnectivity-agent-m2nnx\" (UID: \"086451b6-0c60-4e42-8500-d8f31f29bb1e\") " pod="kube-system/konnectivity-agent-m2nnx" Apr 24 16:39:04.306764 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306544 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-etc-modprobe-d\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.306764 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306562 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-lib-modules\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.306764 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306567 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-sys\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.306764 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306589 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/35142d06-6635-497c-8778-a5ebcc31867f-socket-dir\") pod \"aws-ebs-csi-driver-node-66bmv\" (UID: \"35142d06-6635-497c-8778-a5ebcc31867f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv" Apr 24 16:39:04.306764 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306609 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-sys\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.306764 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306656 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/35142d06-6635-497c-8778-a5ebcc31867f-etc-selinux\") pod \"aws-ebs-csi-driver-node-66bmv\" (UID: \"35142d06-6635-497c-8778-a5ebcc31867f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv" Apr 24 16:39:04.306764 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306684 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-etc-tuned\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.306764 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306713 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/35142d06-6635-497c-8778-a5ebcc31867f-registration-dir\") pod \"aws-ebs-csi-driver-node-66bmv\" (UID: \"35142d06-6635-497c-8778-a5ebcc31867f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv" Apr 24 16:39:04.306764 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306740 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhkn9\" (UniqueName: \"kubernetes.io/projected/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-kube-api-access-fhkn9\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.306764 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306763 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hwrx\" (UniqueName: \"kubernetes.io/projected/2267e8c5-f77e-4e17-a96f-9463ea75c147-kube-api-access-4hwrx\") pod \"node-resolver-djdhr\" (UID: \"2267e8c5-f77e-4e17-a96f-9463ea75c147\") " pod="openshift-dns/node-resolver-djdhr" Apr 24 16:39:04.307418 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306789 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35142d06-6635-497c-8778-a5ebcc31867f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-66bmv\" (UID: \"35142d06-6635-497c-8778-a5ebcc31867f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv" Apr 24 16:39:04.307418 
ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306811 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6fd88de9-ce02-44e6-9d5c-0b5dbe13f0c4-host\") pod \"node-ca-slcg8\" (UID: \"6fd88de9-ce02-44e6-9d5c-0b5dbe13f0c4\") " pod="openshift-image-registry/node-ca-slcg8" Apr 24 16:39:04.307418 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306863 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2267e8c5-f77e-4e17-a96f-9463ea75c147-hosts-file\") pod \"node-resolver-djdhr\" (UID: \"2267e8c5-f77e-4e17-a96f-9463ea75c147\") " pod="openshift-dns/node-resolver-djdhr" Apr 24 16:39:04.307418 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306685 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-etc-modprobe-d\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.307418 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306893 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kf6xz\" (UniqueName: \"kubernetes.io/projected/3052a162-5d36-4309-9bd0-bca01410b715-kube-api-access-kf6xz\") pod \"network-metrics-daemon-74mjh\" (UID: \"3052a162-5d36-4309-9bd0-bca01410b715\") " pod="openshift-multus/network-metrics-daemon-74mjh" Apr 24 16:39:04.307418 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306922 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-run\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.307418 ip-10-0-142-182 
kubenswrapper[2573]: I0424 16:39:04.306900 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/35142d06-6635-497c-8778-a5ebcc31867f-registration-dir\") pod \"aws-ebs-csi-driver-node-66bmv\" (UID: \"35142d06-6635-497c-8778-a5ebcc31867f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv" Apr 24 16:39:04.307418 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306982 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-host\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.307418 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306952 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/35142d06-6635-497c-8778-a5ebcc31867f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-66bmv\" (UID: \"35142d06-6635-497c-8778-a5ebcc31867f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv" Apr 24 16:39:04.307418 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.307062 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-run\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.307418 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.306949 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-host\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.307418 ip-10-0-142-182 kubenswrapper[2573]: I0424 
16:39:04.307118 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-var-lib-kubelet\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.307418 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.307145 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-tmp\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.307418 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.307148 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2267e8c5-f77e-4e17-a96f-9463ea75c147-hosts-file\") pod \"node-resolver-djdhr\" (UID: \"2267e8c5-f77e-4e17-a96f-9463ea75c147\") " pod="openshift-dns/node-resolver-djdhr" Apr 24 16:39:04.307418 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.307159 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/086451b6-0c60-4e42-8500-d8f31f29bb1e-konnectivity-ca\") pod \"konnectivity-agent-m2nnx\" (UID: \"086451b6-0c60-4e42-8500-d8f31f29bb1e\") " pod="kube-system/konnectivity-agent-m2nnx" Apr 24 16:39:04.307418 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.307174 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/35142d06-6635-497c-8778-a5ebcc31867f-device-dir\") pod \"aws-ebs-csi-driver-node-66bmv\" (UID: \"35142d06-6635-497c-8778-a5ebcc31867f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv" Apr 24 16:39:04.307418 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.307200 
2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6fd88de9-ce02-44e6-9d5c-0b5dbe13f0c4-host\") pod \"node-ca-slcg8\" (UID: \"6fd88de9-ce02-44e6-9d5c-0b5dbe13f0c4\") " pod="openshift-image-registry/node-ca-slcg8" Apr 24 16:39:04.307418 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.307206 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6fd88de9-ce02-44e6-9d5c-0b5dbe13f0c4-serviceca\") pod \"node-ca-slcg8\" (UID: \"6fd88de9-ce02-44e6-9d5c-0b5dbe13f0c4\") " pod="openshift-image-registry/node-ca-slcg8" Apr 24 16:39:04.308120 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.307255 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-var-lib-kubelet\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.308120 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.307456 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/35142d06-6635-497c-8778-a5ebcc31867f-device-dir\") pod \"aws-ebs-csi-driver-node-66bmv\" (UID: \"35142d06-6635-497c-8778-a5ebcc31867f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv" Apr 24 16:39:04.308120 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.307487 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2267e8c5-f77e-4e17-a96f-9463ea75c147-tmp-dir\") pod \"node-resolver-djdhr\" (UID: \"2267e8c5-f77e-4e17-a96f-9463ea75c147\") " pod="openshift-dns/node-resolver-djdhr" Apr 24 16:39:04.308917 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.308892 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"agent-certs\" (UniqueName: \"kubernetes.io/secret/086451b6-0c60-4e42-8500-d8f31f29bb1e-agent-certs\") pod \"konnectivity-agent-m2nnx\" (UID: \"086451b6-0c60-4e42-8500-d8f31f29bb1e\") " pod="kube-system/konnectivity-agent-m2nnx" Apr 24 16:39:04.309016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.308986 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-etc-tuned\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.309232 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.309208 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-tmp\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.315423 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.315401 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cntsh\" (UniqueName: \"kubernetes.io/projected/6fd88de9-ce02-44e6-9d5c-0b5dbe13f0c4-kube-api-access-cntsh\") pod \"node-ca-slcg8\" (UID: \"6fd88de9-ce02-44e6-9d5c-0b5dbe13f0c4\") " pod="openshift-image-registry/node-ca-slcg8" Apr 24 16:39:04.316208 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.316189 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hwrx\" (UniqueName: \"kubernetes.io/projected/2267e8c5-f77e-4e17-a96f-9463ea75c147-kube-api-access-4hwrx\") pod \"node-resolver-djdhr\" (UID: \"2267e8c5-f77e-4e17-a96f-9463ea75c147\") " pod="openshift-dns/node-resolver-djdhr" Apr 24 16:39:04.316671 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.316648 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhkn9\" (UniqueName: 
\"kubernetes.io/projected/0217b78c-e8f2-479d-a268-a3e3fad5f9b6-kube-api-access-fhkn9\") pod \"tuned-hz7p4\" (UID: \"0217b78c-e8f2-479d-a268-a3e3fad5f9b6\") " pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.316865 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.316850 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgvb5\" (UniqueName: \"kubernetes.io/projected/35142d06-6635-497c-8778-a5ebcc31867f-kube-api-access-hgvb5\") pod \"aws-ebs-csi-driver-node-66bmv\" (UID: \"35142d06-6635-497c-8778-a5ebcc31867f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv" Apr 24 16:39:04.317866 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.317848 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf6xz\" (UniqueName: \"kubernetes.io/projected/3052a162-5d36-4309-9bd0-bca01410b715-kube-api-access-kf6xz\") pod \"network-metrics-daemon-74mjh\" (UID: \"3052a162-5d36-4309-9bd0-bca01410b715\") " pod="openshift-multus/network-metrics-daemon-74mjh" Apr 24 16:39:04.393380 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.393295 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tkpx8" Apr 24 16:39:04.401074 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.401053 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x52vd" Apr 24 16:39:04.410654 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.410637 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-75zzc" Apr 24 16:39:04.415247 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.415226 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" Apr 24 16:39:04.420842 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.420825 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" Apr 24 16:39:04.428371 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.428354 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-slcg8" Apr 24 16:39:04.433877 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.433862 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-djdhr" Apr 24 16:39:04.440386 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.440371 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-m2nnx" Apr 24 16:39:04.444863 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.444845 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv" Apr 24 16:39:04.811050 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.811027 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs\") pod \"network-metrics-daemon-74mjh\" (UID: \"3052a162-5d36-4309-9bd0-bca01410b715\") " pod="openshift-multus/network-metrics-daemon-74mjh" Apr 24 16:39:04.811121 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:04.811078 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f66k6\" (UniqueName: \"kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6\") pod \"network-check-target-mgmm7\" (UID: \"4f65e195-bb0b-4b16-893b-21667f13f3a5\") " pod="openshift-network-diagnostics/network-check-target-mgmm7" Apr 24 16:39:04.811188 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:04.811173 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:04.811188 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:04.811184 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:04.811254 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:04.811198 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:04.811254 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:04.811210 2573 projected.go:194] Error preparing data for projected volume kube-api-access-f66k6 for pod openshift-network-diagnostics/network-check-target-mgmm7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:04.811254 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:04.811238 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs podName:3052a162-5d36-4309-9bd0-bca01410b715 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:05.811221416 +0000 UTC m=+4.075936794 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs") pod "network-metrics-daemon-74mjh" (UID: "3052a162-5d36-4309-9bd0-bca01410b715") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:04.811361 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:04.811255 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6 podName:4f65e195-bb0b-4b16-893b-21667f13f3a5 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:05.811246651 +0000 UTC m=+4.075962026 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-f66k6" (UniqueName: "kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6") pod "network-check-target-mgmm7" (UID: "4f65e195-bb0b-4b16-893b-21667f13f3a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:04.824800 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:04.824778 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5834b4f_9b4a_49b9_9cee_068a23d3d4a8.slice/crio-6c1fdb6ef6f8ada3d6fd3c9aa1290317e74491f19686c0f4633c02e46df89bc4 WatchSource:0}: Error finding container 6c1fdb6ef6f8ada3d6fd3c9aa1290317e74491f19686c0f4633c02e46df89bc4: Status 404 returned error can't find the container with id 6c1fdb6ef6f8ada3d6fd3c9aa1290317e74491f19686c0f4633c02e46df89bc4 Apr 24 16:39:04.825184 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:04.825163 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0217b78c_e8f2_479d_a268_a3e3fad5f9b6.slice/crio-6989045ecabc16169073c3f1ff46970fdf3b04dbc1a934bf18643cc0e4964e0d WatchSource:0}: Error finding container 6989045ecabc16169073c3f1ff46970fdf3b04dbc1a934bf18643cc0e4964e0d: Status 404 returned error can't find the container with id 6989045ecabc16169073c3f1ff46970fdf3b04dbc1a934bf18643cc0e4964e0d Apr 24 16:39:04.826490 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:04.826468 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85c3e0af_8027_49c2_937d_99acbd5f7085.slice/crio-1e195bf0fb085dd6cd31c119f1fc226195f6fe3d4ff5605649e726277f6af483 WatchSource:0}: Error finding container 1e195bf0fb085dd6cd31c119f1fc226195f6fe3d4ff5605649e726277f6af483: Status 404 returned error can't find the 
container with id 1e195bf0fb085dd6cd31c119f1fc226195f6fe3d4ff5605649e726277f6af483 Apr 24 16:39:04.827605 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:04.827363 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35142d06_6635_497c_8778_a5ebcc31867f.slice/crio-adac6e3db251e727064d6e11f9469cd05f2bbf730481f21d97681c1ce8f18e4b WatchSource:0}: Error finding container adac6e3db251e727064d6e11f9469cd05f2bbf730481f21d97681c1ce8f18e4b: Status 404 returned error can't find the container with id adac6e3db251e727064d6e11f9469cd05f2bbf730481f21d97681c1ce8f18e4b Apr 24 16:39:04.828025 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:04.827947 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf6acdc8_67c6_4733_b49a_03a69f37ba5b.slice/crio-6cb0d27160ced1afd242c530b277779b8877494d957727df90f3a4521c21406e WatchSource:0}: Error finding container 6cb0d27160ced1afd242c530b277779b8877494d957727df90f3a4521c21406e: Status 404 returned error can't find the container with id 6cb0d27160ced1afd242c530b277779b8877494d957727df90f3a4521c21406e Apr 24 16:39:04.829077 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:04.829054 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86788d4c_c402_4057_988b_77279d8fd61c.slice/crio-bdd8284f8ef0048426d147eb8be5c437a62a48b75ed9e46cc7d447b487830f0f WatchSource:0}: Error finding container bdd8284f8ef0048426d147eb8be5c437a62a48b75ed9e46cc7d447b487830f0f: Status 404 returned error can't find the container with id bdd8284f8ef0048426d147eb8be5c437a62a48b75ed9e46cc7d447b487830f0f Apr 24 16:39:04.830065 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:04.830038 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2267e8c5_f77e_4e17_a96f_9463ea75c147.slice/crio-5850951be150aa2e707fcf2907ef9e84705b334e83ba3ceb5988633861cafa2c WatchSource:0}: Error finding container 5850951be150aa2e707fcf2907ef9e84705b334e83ba3ceb5988633861cafa2c: Status 404 returned error can't find the container with id 5850951be150aa2e707fcf2907ef9e84705b334e83ba3ceb5988633861cafa2c Apr 24 16:39:04.833519 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:04.833499 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fd88de9_ce02_44e6_9d5c_0b5dbe13f0c4.slice/crio-268375c5a00c4f1098bfafde16a8f8e99aec48d5ebc901ffb71782cd7e5d867c WatchSource:0}: Error finding container 268375c5a00c4f1098bfafde16a8f8e99aec48d5ebc901ffb71782cd7e5d867c: Status 404 returned error can't find the container with id 268375c5a00c4f1098bfafde16a8f8e99aec48d5ebc901ffb71782cd7e5d867c Apr 24 16:39:04.833761 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:04.833741 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod086451b6_0c60_4e42_8500_d8f31f29bb1e.slice/crio-20af690e7b3979e57bff1864e1753323db39db8d7d71e38349b3b1574bd359d0 WatchSource:0}: Error finding container 20af690e7b3979e57bff1864e1753323db39db8d7d71e38349b3b1574bd359d0: Status 404 returned error can't find the container with id 20af690e7b3979e57bff1864e1753323db39db8d7d71e38349b3b1574bd359d0 Apr 24 16:39:05.138990 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:05.138891 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 16:34:03 +0000 UTC" deadline="2027-12-06 02:14:16.413287157 +0000 UTC" Apr 24 16:39:05.138990 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:05.138918 2573 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="14169h35m11.274371003s" Apr 24 16:39:05.212527 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:05.212495 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mgmm7" Apr 24 16:39:05.212691 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:05.212628 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mgmm7" podUID="4f65e195-bb0b-4b16-893b-21667f13f3a5" Apr 24 16:39:05.220171 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:05.220142 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" event={"ID":"0217b78c-e8f2-479d-a268-a3e3fad5f9b6","Type":"ContainerStarted","Data":"6989045ecabc16169073c3f1ff46970fdf3b04dbc1a934bf18643cc0e4964e0d"} Apr 24 16:39:05.223005 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:05.222961 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-182.ec2.internal" event={"ID":"63bf82c7cb78a8f9ba0c1c606e45e954","Type":"ContainerStarted","Data":"16085e490a3e74c5023873479023b3d8187d192ba79eeafa5fe6cec423189f6a"} Apr 24 16:39:05.224208 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:05.224185 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-slcg8" event={"ID":"6fd88de9-ce02-44e6-9d5c-0b5dbe13f0c4","Type":"ContainerStarted","Data":"268375c5a00c4f1098bfafde16a8f8e99aec48d5ebc901ffb71782cd7e5d867c"} Apr 24 16:39:05.225228 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:05.225208 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-djdhr" 
event={"ID":"2267e8c5-f77e-4e17-a96f-9463ea75c147","Type":"ContainerStarted","Data":"5850951be150aa2e707fcf2907ef9e84705b334e83ba3ceb5988633861cafa2c"} Apr 24 16:39:05.227174 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:05.227152 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tkpx8" event={"ID":"df6acdc8-67c6-4733-b49a-03a69f37ba5b","Type":"ContainerStarted","Data":"6cb0d27160ced1afd242c530b277779b8877494d957727df90f3a4521c21406e"} Apr 24 16:39:05.230503 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:05.230468 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-75zzc" event={"ID":"f5834b4f-9b4a-49b9-9cee-068a23d3d4a8","Type":"ContainerStarted","Data":"6c1fdb6ef6f8ada3d6fd3c9aa1290317e74491f19686c0f4633c02e46df89bc4"} Apr 24 16:39:05.232561 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:05.232537 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-m2nnx" event={"ID":"086451b6-0c60-4e42-8500-d8f31f29bb1e","Type":"ContainerStarted","Data":"20af690e7b3979e57bff1864e1753323db39db8d7d71e38349b3b1574bd359d0"} Apr 24 16:39:05.234205 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:05.234161 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" event={"ID":"86788d4c-c402-4057-988b-77279d8fd61c","Type":"ContainerStarted","Data":"bdd8284f8ef0048426d147eb8be5c437a62a48b75ed9e46cc7d447b487830f0f"} Apr 24 16:39:05.237296 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:05.236802 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-182.ec2.internal" podStartSLOduration=2.236786565 podStartE2EDuration="2.236786565s" podCreationTimestamp="2026-04-24 16:39:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:39:05.236026858 +0000 
UTC m=+3.500742255" watchObservedRunningTime="2026-04-24 16:39:05.236786565 +0000 UTC m=+3.501501962" Apr 24 16:39:05.239029 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:05.239010 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv" event={"ID":"35142d06-6635-497c-8778-a5ebcc31867f","Type":"ContainerStarted","Data":"adac6e3db251e727064d6e11f9469cd05f2bbf730481f21d97681c1ce8f18e4b"} Apr 24 16:39:05.241648 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:05.241625 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x52vd" event={"ID":"85c3e0af-8027-49c2-937d-99acbd5f7085","Type":"ContainerStarted","Data":"1e195bf0fb085dd6cd31c119f1fc226195f6fe3d4ff5605649e726277f6af483"} Apr 24 16:39:05.818702 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:05.818669 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs\") pod \"network-metrics-daemon-74mjh\" (UID: \"3052a162-5d36-4309-9bd0-bca01410b715\") " pod="openshift-multus/network-metrics-daemon-74mjh" Apr 24 16:39:05.818805 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:05.818751 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f66k6\" (UniqueName: \"kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6\") pod \"network-check-target-mgmm7\" (UID: \"4f65e195-bb0b-4b16-893b-21667f13f3a5\") " pod="openshift-network-diagnostics/network-check-target-mgmm7" Apr 24 16:39:05.818909 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:05.818893 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:05.818954 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:05.818916 2573 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:05.818954 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:05.818930 2573 projected.go:194] Error preparing data for projected volume kube-api-access-f66k6 for pod openshift-network-diagnostics/network-check-target-mgmm7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:05.819056 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:05.818989 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6 podName:4f65e195-bb0b-4b16-893b-21667f13f3a5 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:07.818970485 +0000 UTC m=+6.083685866 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-f66k6" (UniqueName: "kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6") pod "network-check-target-mgmm7" (UID: "4f65e195-bb0b-4b16-893b-21667f13f3a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:05.819441 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:05.819421 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:05.819530 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:05.819473 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs podName:3052a162-5d36-4309-9bd0-bca01410b715 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:07.819459912 +0000 UTC m=+6.084175292 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs") pod "network-metrics-daemon-74mjh" (UID: "3052a162-5d36-4309-9bd0-bca01410b715") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:06.215169 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:06.214551 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-74mjh" Apr 24 16:39:06.215169 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:06.214685 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-74mjh" podUID="3052a162-5d36-4309-9bd0-bca01410b715" Apr 24 16:39:06.253343 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:06.252833 2573 generic.go:358] "Generic (PLEG): container finished" podID="5d34cbf5b33a49e6ab42e122744bf511" containerID="1cf21c60b34b1a0db70a57b3d5449a1c07c4b33c7de113a2063cc9e575ac5ce7" exitCode=0 Apr 24 16:39:06.253673 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:06.253297 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-182.ec2.internal" event={"ID":"5d34cbf5b33a49e6ab42e122744bf511","Type":"ContainerDied","Data":"1cf21c60b34b1a0db70a57b3d5449a1c07c4b33c7de113a2063cc9e575ac5ce7"} Apr 24 16:39:07.212365 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:07.212328 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mgmm7" Apr 24 16:39:07.212540 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:07.212493 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mgmm7" podUID="4f65e195-bb0b-4b16-893b-21667f13f3a5" Apr 24 16:39:07.263928 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:07.263870 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-182.ec2.internal" event={"ID":"5d34cbf5b33a49e6ab42e122744bf511","Type":"ContainerStarted","Data":"7fd1b85f558cc546caefc042bc33899528f9a9b86af136ef4ee406bf26446858"} Apr 24 16:39:07.837032 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:07.836989 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f66k6\" (UniqueName: \"kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6\") pod \"network-check-target-mgmm7\" (UID: \"4f65e195-bb0b-4b16-893b-21667f13f3a5\") " pod="openshift-network-diagnostics/network-check-target-mgmm7" Apr 24 16:39:07.837207 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:07.837061 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs\") pod \"network-metrics-daemon-74mjh\" (UID: \"3052a162-5d36-4309-9bd0-bca01410b715\") " pod="openshift-multus/network-metrics-daemon-74mjh" Apr 24 16:39:07.837207 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:07.837184 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:07.837330 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:07.837243 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs podName:3052a162-5d36-4309-9bd0-bca01410b715 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:11.837224854 +0000 UTC m=+10.101940233 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs") pod "network-metrics-daemon-74mjh" (UID: "3052a162-5d36-4309-9bd0-bca01410b715") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:07.837607 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:07.837592 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:07.837659 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:07.837611 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:07.837659 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:07.837620 2573 projected.go:194] Error preparing data for projected volume kube-api-access-f66k6 for pod openshift-network-diagnostics/network-check-target-mgmm7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:07.837659 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:07.837651 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6 podName:4f65e195-bb0b-4b16-893b-21667f13f3a5 nodeName:}" failed. 
No retries permitted until 2026-04-24 16:39:11.837641025 +0000 UTC m=+10.102356400 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-f66k6" (UniqueName: "kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6") pod "network-check-target-mgmm7" (UID: "4f65e195-bb0b-4b16-893b-21667f13f3a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:08.215270 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:08.214782 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-74mjh" Apr 24 16:39:08.215270 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:08.214889 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-74mjh" podUID="3052a162-5d36-4309-9bd0-bca01410b715" Apr 24 16:39:09.212256 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:09.212225 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mgmm7" Apr 24 16:39:09.212641 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:09.212344 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mgmm7" podUID="4f65e195-bb0b-4b16-893b-21667f13f3a5" Apr 24 16:39:10.213951 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:10.213452 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-74mjh" Apr 24 16:39:10.213951 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:10.213601 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-74mjh" podUID="3052a162-5d36-4309-9bd0-bca01410b715" Apr 24 16:39:11.212536 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:11.212496 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mgmm7" Apr 24 16:39:11.212723 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:11.212610 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mgmm7" podUID="4f65e195-bb0b-4b16-893b-21667f13f3a5" Apr 24 16:39:11.874389 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:11.874348 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f66k6\" (UniqueName: \"kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6\") pod \"network-check-target-mgmm7\" (UID: \"4f65e195-bb0b-4b16-893b-21667f13f3a5\") " pod="openshift-network-diagnostics/network-check-target-mgmm7" Apr 24 16:39:11.874795 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:11.874424 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs\") pod \"network-metrics-daemon-74mjh\" (UID: \"3052a162-5d36-4309-9bd0-bca01410b715\") " pod="openshift-multus/network-metrics-daemon-74mjh" Apr 24 16:39:11.874795 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:11.874564 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:11.874795 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:11.874623 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs podName:3052a162-5d36-4309-9bd0-bca01410b715 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:19.874602517 +0000 UTC m=+18.139317894 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs") pod "network-metrics-daemon-74mjh" (UID: "3052a162-5d36-4309-9bd0-bca01410b715") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:11.875016 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:11.875001 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:11.875049 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:11.875022 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:11.875049 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:11.875031 2573 projected.go:194] Error preparing data for projected volume kube-api-access-f66k6 for pod openshift-network-diagnostics/network-check-target-mgmm7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:11.875109 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:11.875063 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6 podName:4f65e195-bb0b-4b16-893b-21667f13f3a5 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:19.875052973 +0000 UTC m=+18.139768348 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-f66k6" (UniqueName: "kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6") pod "network-check-target-mgmm7" (UID: "4f65e195-bb0b-4b16-893b-21667f13f3a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:12.213144 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:12.213057 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-74mjh" Apr 24 16:39:12.213291 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:12.213185 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-74mjh" podUID="3052a162-5d36-4309-9bd0-bca01410b715" Apr 24 16:39:13.213444 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:13.212979 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mgmm7" Apr 24 16:39:13.213444 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:13.213103 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mgmm7" podUID="4f65e195-bb0b-4b16-893b-21667f13f3a5" Apr 24 16:39:14.212019 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:14.211989 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-74mjh" Apr 24 16:39:14.212208 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:14.212124 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-74mjh" podUID="3052a162-5d36-4309-9bd0-bca01410b715" Apr 24 16:39:15.212267 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:15.212237 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mgmm7" Apr 24 16:39:15.212713 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:15.212359 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mgmm7" podUID="4f65e195-bb0b-4b16-893b-21667f13f3a5" Apr 24 16:39:16.212279 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:16.212246 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-74mjh" Apr 24 16:39:16.212748 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:16.212375 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-74mjh" podUID="3052a162-5d36-4309-9bd0-bca01410b715" Apr 24 16:39:17.212352 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:17.212302 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mgmm7" Apr 24 16:39:17.212726 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:17.212424 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mgmm7" podUID="4f65e195-bb0b-4b16-893b-21667f13f3a5" Apr 24 16:39:18.212417 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:18.212373 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-74mjh" Apr 24 16:39:18.212811 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:18.212519 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-74mjh" podUID="3052a162-5d36-4309-9bd0-bca01410b715" Apr 24 16:39:19.212110 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:19.212078 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mgmm7" Apr 24 16:39:19.212266 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:19.212189 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mgmm7" podUID="4f65e195-bb0b-4b16-893b-21667f13f3a5" Apr 24 16:39:19.927738 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:19.927694 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f66k6\" (UniqueName: \"kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6\") pod \"network-check-target-mgmm7\" (UID: \"4f65e195-bb0b-4b16-893b-21667f13f3a5\") " pod="openshift-network-diagnostics/network-check-target-mgmm7" Apr 24 16:39:19.928150 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:19.927756 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs\") pod \"network-metrics-daemon-74mjh\" (UID: \"3052a162-5d36-4309-9bd0-bca01410b715\") " pod="openshift-multus/network-metrics-daemon-74mjh" Apr 24 16:39:19.928150 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:19.927839 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:19.928150 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:19.927866 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:19.928150 ip-10-0-142-182 kubenswrapper[2573]: 
E0424 16:39:19.927867 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:19.928150 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:19.927882 2573 projected.go:194] Error preparing data for projected volume kube-api-access-f66k6 for pod openshift-network-diagnostics/network-check-target-mgmm7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:19.928150 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:19.927937 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs podName:3052a162-5d36-4309-9bd0-bca01410b715 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:35.927917918 +0000 UTC m=+34.192633301 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs") pod "network-metrics-daemon-74mjh" (UID: "3052a162-5d36-4309-9bd0-bca01410b715") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:19.928150 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:19.927953 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6 podName:4f65e195-bb0b-4b16-893b-21667f13f3a5 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:35.9279465 +0000 UTC m=+34.192661875 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-f66k6" (UniqueName: "kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6") pod "network-check-target-mgmm7" (UID: "4f65e195-bb0b-4b16-893b-21667f13f3a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:20.212835 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:20.212763 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-74mjh" Apr 24 16:39:20.212996 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:20.212871 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-74mjh" podUID="3052a162-5d36-4309-9bd0-bca01410b715" Apr 24 16:39:21.212534 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:21.212502 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mgmm7" Apr 24 16:39:21.213038 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:21.212619 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mgmm7" podUID="4f65e195-bb0b-4b16-893b-21667f13f3a5" Apr 24 16:39:22.214345 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:22.214087 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-74mjh"
Apr 24 16:39:22.214753 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:22.214721 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-74mjh" podUID="3052a162-5d36-4309-9bd0-bca01410b715"
Apr 24 16:39:22.291246 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:22.291220 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" event={"ID":"86788d4c-c402-4057-988b-77279d8fd61c","Type":"ContainerStarted","Data":"3d2520899de11bb036c53819524f64b56e26872aa21d932e0db983d9b25688de"}
Apr 24 16:39:22.291363 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:22.291256 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" event={"ID":"86788d4c-c402-4057-988b-77279d8fd61c","Type":"ContainerStarted","Data":"2dbc4892a138da143dd3ada290d9a6552d77d59b93e0d01f758be644f1f5df07"}
Apr 24 16:39:22.292559 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:22.292534 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv" event={"ID":"35142d06-6635-497c-8778-a5ebcc31867f","Type":"ContainerStarted","Data":"32a0bf43f955fcc120f3833e1a656f0b4a2e7dfa72b58162983244ed7c52b603"}
Apr 24 16:39:22.294046 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:22.293524 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x52vd" event={"ID":"85c3e0af-8027-49c2-937d-99acbd5f7085","Type":"ContainerStarted","Data":"f7df629d9470807e9907b4ada3ed9551c3e3d5388a87483f93e2f5352d8613db"}
Apr 24 16:39:22.294387 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:22.294367 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" event={"ID":"0217b78c-e8f2-479d-a268-a3e3fad5f9b6","Type":"ContainerStarted","Data":"180a70f6d8bd1bd9155019b53d5de070ddcbc14fb12bdb40f29abcbded28142e"}
Apr 24 16:39:22.295826 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:22.295493 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-slcg8" event={"ID":"6fd88de9-ce02-44e6-9d5c-0b5dbe13f0c4","Type":"ContainerStarted","Data":"73eaa1afa313d38698b93cd6f1676addc389fa56b31b897e73daafa82c12def9"}
Apr 24 16:39:22.296764 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:22.296738 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-djdhr" event={"ID":"2267e8c5-f77e-4e17-a96f-9463ea75c147","Type":"ContainerStarted","Data":"1c5abfac250b4e95438590bc33f5b11c34cef5e4f82bcd364e193315afd1f122"}
Apr 24 16:39:22.297690 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:22.297630 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tkpx8" event={"ID":"df6acdc8-67c6-4733-b49a-03a69f37ba5b","Type":"ContainerStarted","Data":"dcfae91267c2ef9f3bd4b94e1ec79f560de28784a9be1da8dc932cd74ed8b3c5"}
Apr 24 16:39:22.298604 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:22.298580 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-m2nnx" event={"ID":"086451b6-0c60-4e42-8500-d8f31f29bb1e","Type":"ContainerStarted","Data":"fd29f39b2390c541e723e84c9651ac715a2ea5c609e0b35a1b50063d02e897c1"}
Apr 24 16:39:22.314854 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:22.314815 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-182.ec2.internal" podStartSLOduration=19.314804273 podStartE2EDuration="19.314804273s" podCreationTimestamp="2026-04-24 16:39:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:39:07.280130638 +0000 UTC m=+5.544846039" watchObservedRunningTime="2026-04-24 16:39:22.314804273 +0000 UTC m=+20.579519699"
Apr 24 16:39:22.331093 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:22.331056 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-tkpx8" podStartSLOduration=3.189011664 podStartE2EDuration="20.331044948s" podCreationTimestamp="2026-04-24 16:39:02 +0000 UTC" firstStartedPulling="2026-04-24 16:39:04.830446256 +0000 UTC m=+3.095161645" lastFinishedPulling="2026-04-24 16:39:21.972479554 +0000 UTC m=+20.237194929" observedRunningTime="2026-04-24 16:39:22.330778055 +0000 UTC m=+20.595493448" watchObservedRunningTime="2026-04-24 16:39:22.331044948 +0000 UTC m=+20.595760345"
Apr 24 16:39:22.354102 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:22.354060 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-hz7p4" podStartSLOduration=3.244584167 podStartE2EDuration="20.35404854s" podCreationTimestamp="2026-04-24 16:39:02 +0000 UTC" firstStartedPulling="2026-04-24 16:39:04.826882751 +0000 UTC m=+3.091598126" lastFinishedPulling="2026-04-24 16:39:21.936347114 +0000 UTC m=+20.201062499" observedRunningTime="2026-04-24 16:39:22.35367252 +0000 UTC m=+20.618387916" watchObservedRunningTime="2026-04-24 16:39:22.35404854 +0000 UTC m=+20.618763937"
Apr 24 16:39:22.368818 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:22.368783 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-slcg8" podStartSLOduration=3.522134021 podStartE2EDuration="20.368772179s" podCreationTimestamp="2026-04-24 16:39:02 +0000 UTC" firstStartedPulling="2026-04-24 16:39:04.835726996 +0000 UTC m=+3.100442372" lastFinishedPulling="2026-04-24 16:39:21.682365143 +0000 UTC m=+19.947080530" observedRunningTime="2026-04-24 16:39:22.368321923 +0000 UTC m=+20.633037312" watchObservedRunningTime="2026-04-24 16:39:22.368772179 +0000 UTC m=+20.633487576"
Apr 24 16:39:22.382249 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:22.382211 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-djdhr" podStartSLOduration=3.4895212620000002 podStartE2EDuration="20.382200397s" podCreationTimestamp="2026-04-24 16:39:02 +0000 UTC" firstStartedPulling="2026-04-24 16:39:04.831656412 +0000 UTC m=+3.096371787" lastFinishedPulling="2026-04-24 16:39:21.724335547 +0000 UTC m=+19.989050922" observedRunningTime="2026-04-24 16:39:22.382101061 +0000 UTC m=+20.646816468" watchObservedRunningTime="2026-04-24 16:39:22.382200397 +0000 UTC m=+20.646915793"
Apr 24 16:39:23.213103 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:23.212890 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mgmm7"
Apr 24 16:39:23.213250 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:23.213134 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mgmm7" podUID="4f65e195-bb0b-4b16-893b-21667f13f3a5"
Apr 24 16:39:23.302922 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:23.302898 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log"
Apr 24 16:39:23.303621 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:23.303183 2573 generic.go:358] "Generic (PLEG): container finished" podID="86788d4c-c402-4057-988b-77279d8fd61c" containerID="3d2520899de11bb036c53819524f64b56e26872aa21d932e0db983d9b25688de" exitCode=1
Apr 24 16:39:23.303621 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:23.303244 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" event={"ID":"86788d4c-c402-4057-988b-77279d8fd61c","Type":"ContainerDied","Data":"3d2520899de11bb036c53819524f64b56e26872aa21d932e0db983d9b25688de"}
Apr 24 16:39:23.303621 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:23.303269 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" event={"ID":"86788d4c-c402-4057-988b-77279d8fd61c","Type":"ContainerStarted","Data":"e3380ac81c80ce91cf20d1540489e95f529adee087825fc6c046f61e1bc1df08"}
Apr 24 16:39:23.303621 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:23.303281 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" event={"ID":"86788d4c-c402-4057-988b-77279d8fd61c","Type":"ContainerStarted","Data":"1665bf41353a89788eb37da1dd8426ce7cb37cd03e410c89a326f3001bbce530"}
Apr 24 16:39:23.303621 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:23.303288 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" event={"ID":"86788d4c-c402-4057-988b-77279d8fd61c","Type":"ContainerStarted","Data":"244556ab7097816de6c9a5e6f9f1385c6cd14bfeceb085d485ccc83b3460cc74"}
Apr 24 16:39:23.303621 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:23.303297 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" event={"ID":"86788d4c-c402-4057-988b-77279d8fd61c","Type":"ContainerStarted","Data":"3c5d7dac16088aa69e1826842ad000f37f113458fef61f1fbd96b46e870c455f"}
Apr 24 16:39:23.304339 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:23.304319 2573 generic.go:358] "Generic (PLEG): container finished" podID="85c3e0af-8027-49c2-937d-99acbd5f7085" containerID="f7df629d9470807e9907b4ada3ed9551c3e3d5388a87483f93e2f5352d8613db" exitCode=0
Apr 24 16:39:23.304443 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:23.304418 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x52vd" event={"ID":"85c3e0af-8027-49c2-937d-99acbd5f7085","Type":"ContainerDied","Data":"f7df629d9470807e9907b4ada3ed9551c3e3d5388a87483f93e2f5352d8613db"}
Apr 24 16:39:23.326159 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:23.326122 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-m2nnx" podStartSLOduration=4.480294453 podStartE2EDuration="21.326109266s" podCreationTimestamp="2026-04-24 16:39:02 +0000 UTC" firstStartedPulling="2026-04-24 16:39:04.836547237 +0000 UTC m=+3.101262613" lastFinishedPulling="2026-04-24 16:39:21.682362037 +0000 UTC m=+19.947077426" observedRunningTime="2026-04-24 16:39:22.399994369 +0000 UTC m=+20.664709766" watchObservedRunningTime="2026-04-24 16:39:23.326109266 +0000 UTC m=+21.590824662"
Apr 24 16:39:23.538187 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:23.538158 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 16:39:24.160753 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:24.160669 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T16:39:23.538173701Z","UUID":"855c5813-143c-473f-ba75-074efb4a4365","Handler":null,"Name":"","Endpoint":""}
Apr 24 16:39:24.163435 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:24.163412 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 16:39:24.163435 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:24.163442 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 16:39:24.212806 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:24.212767 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-74mjh"
Apr 24 16:39:24.212986 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:24.212879 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-74mjh" podUID="3052a162-5d36-4309-9bd0-bca01410b715"
Apr 24 16:39:24.308056 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:24.307989 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-75zzc" event={"ID":"f5834b4f-9b4a-49b9-9cee-068a23d3d4a8","Type":"ContainerStarted","Data":"57b0d2dc2dc8a8334256ac33e9672d7c4c3a2f29b3994a93117a766c0eca0317"}
Apr 24 16:39:24.309772 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:24.309744 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv" event={"ID":"35142d06-6635-497c-8778-a5ebcc31867f","Type":"ContainerStarted","Data":"2f0c1fe7c1e4720c30c0f839e39c6217f8b0d43cd86ea43b835f3e3004714ad0"}
Apr 24 16:39:24.322678 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:24.322632 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-75zzc" podStartSLOduration=5.212631121 podStartE2EDuration="22.322620568s" podCreationTimestamp="2026-04-24 16:39:02 +0000 UTC" firstStartedPulling="2026-04-24 16:39:04.826074556 +0000 UTC m=+3.090789948" lastFinishedPulling="2026-04-24 16:39:21.936064018 +0000 UTC m=+20.200779395" observedRunningTime="2026-04-24 16:39:24.322495386 +0000 UTC m=+22.587210783" watchObservedRunningTime="2026-04-24 16:39:24.322620568 +0000 UTC m=+22.587335966"
Apr 24 16:39:25.212052 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:25.211978 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mgmm7"
Apr 24 16:39:25.212209 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:25.212094 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mgmm7" podUID="4f65e195-bb0b-4b16-893b-21667f13f3a5"
Apr 24 16:39:25.216045 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:25.216014 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-m2nnx"
Apr 24 16:39:25.314324 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:25.314278 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log"
Apr 24 16:39:25.314742 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:25.314699 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" event={"ID":"86788d4c-c402-4057-988b-77279d8fd61c","Type":"ContainerStarted","Data":"a818fbcd290f987589a8c3ff00d4e8226efdd8950b44b7701a7efee0e2a3a6ac"}
Apr 24 16:39:25.317061 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:25.317033 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv" event={"ID":"35142d06-6635-497c-8778-a5ebcc31867f","Type":"ContainerStarted","Data":"f6ab522a3b1acefa3311b460e0cd3af1484181c6307d8e25e29065423661ae24"}
Apr 24 16:39:26.212979 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:26.212811 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-74mjh"
Apr 24 16:39:26.213139 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:26.213085 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-74mjh" podUID="3052a162-5d36-4309-9bd0-bca01410b715"
Apr 24 16:39:27.140475 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:27.140449 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-m2nnx"
Apr 24 16:39:27.141072 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:27.141039 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-m2nnx"
Apr 24 16:39:27.155565 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:27.155516 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66bmv" podStartSLOduration=5.359268615 podStartE2EDuration="25.1555036s" podCreationTimestamp="2026-04-24 16:39:02 +0000 UTC" firstStartedPulling="2026-04-24 16:39:04.829218733 +0000 UTC m=+3.093934108" lastFinishedPulling="2026-04-24 16:39:24.625453708 +0000 UTC m=+22.890169093" observedRunningTime="2026-04-24 16:39:25.341329738 +0000 UTC m=+23.606045129" watchObservedRunningTime="2026-04-24 16:39:27.1555036 +0000 UTC m=+25.420219019"
Apr 24 16:39:27.212375 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:27.212292 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mgmm7"
Apr 24 16:39:27.212501 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:27.212438 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mgmm7" podUID="4f65e195-bb0b-4b16-893b-21667f13f3a5"
Apr 24 16:39:27.325137 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:27.324978 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log"
Apr 24 16:39:27.325913 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:27.325629 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" event={"ID":"86788d4c-c402-4057-988b-77279d8fd61c","Type":"ContainerStarted","Data":"98544876abf2bcdf18fab63d3121294ce16593bdfdc8b38b8fda28b48674227e"}
Apr 24 16:39:27.325913 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:27.325847 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:27.326344 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:27.325963 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:27.326344 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:27.325985 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:27.326344 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:27.326108 2573 scope.go:117] "RemoveContainer" containerID="3d2520899de11bb036c53819524f64b56e26872aa21d932e0db983d9b25688de"
Apr 24 16:39:27.326872 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:27.326341 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-m2nnx"
Apr 24 16:39:27.343902 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:27.342738 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:27.345784 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:27.345494 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:28.212487 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:28.212457 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-74mjh"
Apr 24 16:39:28.213074 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:28.212590 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-74mjh" podUID="3052a162-5d36-4309-9bd0-bca01410b715"
Apr 24 16:39:28.330250 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:28.330084 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log"
Apr 24 16:39:28.330662 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:28.330631 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" event={"ID":"86788d4c-c402-4057-988b-77279d8fd61c","Type":"ContainerStarted","Data":"3446a563e5cf695cdbadf0b66bdb0e1d6442a63118bb74035cf0c34dc8b3b275"}
Apr 24 16:39:28.332182 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:28.332160 2573 generic.go:358] "Generic (PLEG): container finished" podID="85c3e0af-8027-49c2-937d-99acbd5f7085" containerID="f99e847369d5b80e9c92e33622d48d1eb5ba3886a86ad74da5aaefe67121451d" exitCode=0
Apr 24 16:39:28.332322 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:28.332255 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x52vd" event={"ID":"85c3e0af-8027-49c2-937d-99acbd5f7085","Type":"ContainerDied","Data":"f99e847369d5b80e9c92e33622d48d1eb5ba3886a86ad74da5aaefe67121451d"}
Apr 24 16:39:28.365027 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:28.364988 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt" podStartSLOduration=9.258551308 podStartE2EDuration="26.364974922s" podCreationTimestamp="2026-04-24 16:39:02 +0000 UTC" firstStartedPulling="2026-04-24 16:39:04.831217047 +0000 UTC m=+3.095932436" lastFinishedPulling="2026-04-24 16:39:21.937640675 +0000 UTC m=+20.202356050" observedRunningTime="2026-04-24 16:39:28.363778728 +0000 UTC m=+26.628494125" watchObservedRunningTime="2026-04-24 16:39:28.364974922 +0000 UTC m=+26.629690319"
Apr 24 16:39:29.207234 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:29.207143 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-74mjh"]
Apr 24 16:39:29.207385 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:29.207273 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-74mjh"
Apr 24 16:39:29.207385 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:29.207374 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-74mjh" podUID="3052a162-5d36-4309-9bd0-bca01410b715"
Apr 24 16:39:29.209887 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:29.209864 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mgmm7"]
Apr 24 16:39:29.209980 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:29.209964 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mgmm7"
Apr 24 16:39:29.210052 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:29.210029 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mgmm7" podUID="4f65e195-bb0b-4b16-893b-21667f13f3a5"
Apr 24 16:39:29.335884 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:29.335850 2573 generic.go:358] "Generic (PLEG): container finished" podID="85c3e0af-8027-49c2-937d-99acbd5f7085" containerID="ea1e4dfca18399bf71564ec15c81717fd9610085307348fb739d524e3e4ed832" exitCode=0
Apr 24 16:39:29.336298 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:29.335921 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x52vd" event={"ID":"85c3e0af-8027-49c2-937d-99acbd5f7085","Type":"ContainerDied","Data":"ea1e4dfca18399bf71564ec15c81717fd9610085307348fb739d524e3e4ed832"}
Apr 24 16:39:30.339757 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:30.339719 2573 generic.go:358] "Generic (PLEG): container finished" podID="85c3e0af-8027-49c2-937d-99acbd5f7085" containerID="093be716770229820ca759ea8125d8e6e71e59f6206eff8ba7c911eca0eb26a9" exitCode=0
Apr 24 16:39:30.340113 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:30.339777 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x52vd" event={"ID":"85c3e0af-8027-49c2-937d-99acbd5f7085","Type":"ContainerDied","Data":"093be716770229820ca759ea8125d8e6e71e59f6206eff8ba7c911eca0eb26a9"}
Apr 24 16:39:31.212289 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:31.212256 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-74mjh"
Apr 24 16:39:31.212479 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:31.212257 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mgmm7"
Apr 24 16:39:31.212479 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:31.212415 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-74mjh" podUID="3052a162-5d36-4309-9bd0-bca01410b715"
Apr 24 16:39:31.212479 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:31.212434 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mgmm7" podUID="4f65e195-bb0b-4b16-893b-21667f13f3a5"
Apr 24 16:39:33.212824 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:33.212639 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-74mjh"
Apr 24 16:39:33.213246 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:33.212639 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mgmm7"
Apr 24 16:39:33.213246 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:33.212929 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-74mjh" podUID="3052a162-5d36-4309-9bd0-bca01410b715"
Apr 24 16:39:33.213246 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:33.212985 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mgmm7" podUID="4f65e195-bb0b-4b16-893b-21667f13f3a5"
Apr 24 16:39:35.212628 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:35.212585 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-74mjh"
Apr 24 16:39:35.213151 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:35.212585 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mgmm7"
Apr 24 16:39:35.213151 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:35.212719 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-74mjh" podUID="3052a162-5d36-4309-9bd0-bca01410b715"
Apr 24 16:39:35.213151 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:35.212758 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mgmm7" podUID="4f65e195-bb0b-4b16-893b-21667f13f3a5"
Apr 24 16:39:35.938120 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:35.938093 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f66k6\" (UniqueName: \"kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6\") pod \"network-check-target-mgmm7\" (UID: \"4f65e195-bb0b-4b16-893b-21667f13f3a5\") " pod="openshift-network-diagnostics/network-check-target-mgmm7"
Apr 24 16:39:35.938267 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:35.938140 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs\") pod \"network-metrics-daemon-74mjh\" (UID: \"3052a162-5d36-4309-9bd0-bca01410b715\") " pod="openshift-multus/network-metrics-daemon-74mjh"
Apr 24 16:39:35.938267 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:35.938258 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:39:35.938356 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:35.938256 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:35.938356 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:35.938273 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:39:35.938356 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:35.938287 2573 projected.go:194] Error preparing data for projected volume kube-api-access-f66k6 for pod openshift-network-diagnostics/network-check-target-mgmm7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:35.938356 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:35.938348 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs podName:3052a162-5d36-4309-9bd0-bca01410b715 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:07.938327518 +0000 UTC m=+66.203042907 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs") pod "network-metrics-daemon-74mjh" (UID: "3052a162-5d36-4309-9bd0-bca01410b715") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:35.938478 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:35.938369 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6 podName:4f65e195-bb0b-4b16-893b-21667f13f3a5 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:07.938358946 +0000 UTC m=+66.203074322 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-f66k6" (UniqueName: "kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6") pod "network-check-target-mgmm7" (UID: "4f65e195-bb0b-4b16-893b-21667f13f3a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:36.034675 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.034644 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-182.ec2.internal" event="NodeReady"
Apr 24 16:39:36.034827 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.034806 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 16:39:36.095462 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.095431 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-f6b8n"]
Apr 24 16:39:36.108181 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.108157 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6b27c"]
Apr 24 16:39:36.108352 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.108332 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-f6b8n"
Apr 24 16:39:36.110990 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.110970 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 24 16:39:36.111117 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.110992 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 24 16:39:36.111511 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.111431 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zzj8j\""
Apr 24 16:39:36.121034 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.121012 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-f6b8n"]
Apr 24 16:39:36.121034 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.121034 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6b27c"]
Apr 24 16:39:36.121201 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.121135 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6b27c"
Apr 24 16:39:36.123856 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.123685 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 16:39:36.123856 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.123736 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 16:39:36.123856 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.123779 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 16:39:36.123856 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.123787 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lhhc7\""
Apr 24 16:39:36.240790 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.240766 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls\") pod \"dns-default-f6b8n\" (UID: \"71bfbde9-7779-4c60-8a8a-0b238f76e255\") " pod="openshift-dns/dns-default-f6b8n"
Apr 24 16:39:36.241238 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.240811 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/71bfbde9-7779-4c60-8a8a-0b238f76e255-tmp-dir\") pod \"dns-default-f6b8n\" (UID: \"71bfbde9-7779-4c60-8a8a-0b238f76e255\") " pod="openshift-dns/dns-default-f6b8n"
Apr 24 16:39:36.241238 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.240829 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vxq9\" (UniqueName: \"kubernetes.io/projected/71bfbde9-7779-4c60-8a8a-0b238f76e255-kube-api-access-6vxq9\") pod \"dns-default-f6b8n\" (UID: \"71bfbde9-7779-4c60-8a8a-0b238f76e255\") " pod="openshift-dns/dns-default-f6b8n"
Apr 24 16:39:36.241238 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.240912 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71bfbde9-7779-4c60-8a8a-0b238f76e255-config-volume\") pod \"dns-default-f6b8n\" (UID: \"71bfbde9-7779-4c60-8a8a-0b238f76e255\") " pod="openshift-dns/dns-default-f6b8n"
Apr 24 16:39:36.241238 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.240960 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmbdc\" (UniqueName: \"kubernetes.io/projected/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-kube-api-access-xmbdc\") pod \"ingress-canary-6b27c\" (UID: \"8ae78d18-eee9-4ff6-b5b1-81a6bd62493c\") " pod="openshift-ingress-canary/ingress-canary-6b27c"
Apr 24 16:39:36.241238 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.241011 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert\") pod \"ingress-canary-6b27c\" (UID: \"8ae78d18-eee9-4ff6-b5b1-81a6bd62493c\") " pod="openshift-ingress-canary/ingress-canary-6b27c"
Apr 24 16:39:36.342098 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.342066 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/71bfbde9-7779-4c60-8a8a-0b238f76e255-tmp-dir\") pod \"dns-default-f6b8n\" (UID: \"71bfbde9-7779-4c60-8a8a-0b238f76e255\") " pod="openshift-dns/dns-default-f6b8n"
Apr 24 16:39:36.342208 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.342105 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vxq9\" (UniqueName: \"kubernetes.io/projected/71bfbde9-7779-4c60-8a8a-0b238f76e255-kube-api-access-6vxq9\") pod \"dns-default-f6b8n\" (UID: \"71bfbde9-7779-4c60-8a8a-0b238f76e255\") " pod="openshift-dns/dns-default-f6b8n"
Apr 24 16:39:36.342208 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.342131 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71bfbde9-7779-4c60-8a8a-0b238f76e255-config-volume\") pod \"dns-default-f6b8n\" (UID: \"71bfbde9-7779-4c60-8a8a-0b238f76e255\") " pod="openshift-dns/dns-default-f6b8n"
Apr 24 16:39:36.342208 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.342170 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmbdc\" (UniqueName: \"kubernetes.io/projected/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-kube-api-access-xmbdc\") pod \"ingress-canary-6b27c\" (UID: \"8ae78d18-eee9-4ff6-b5b1-81a6bd62493c\") " pod="openshift-ingress-canary/ingress-canary-6b27c"
Apr 24 16:39:36.342372 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.342217 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert\") pod \"ingress-canary-6b27c\" (UID: \"8ae78d18-eee9-4ff6-b5b1-81a6bd62493c\") " pod="openshift-ingress-canary/ingress-canary-6b27c"
Apr 24 16:39:36.342372 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.342245 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls\") pod \"dns-default-f6b8n\" (UID: \"71bfbde9-7779-4c60-8a8a-0b238f76e255\") " pod="openshift-dns/dns-default-f6b8n"
Apr 24 16:39:36.342372 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:36.342361 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret 
"canary-serving-cert" not found Apr 24 16:39:36.342512 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:36.342424 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert podName:8ae78d18-eee9-4ff6-b5b1-81a6bd62493c nodeName:}" failed. No retries permitted until 2026-04-24 16:39:36.842407094 +0000 UTC m=+35.107122469 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert") pod "ingress-canary-6b27c" (UID: "8ae78d18-eee9-4ff6-b5b1-81a6bd62493c") : secret "canary-serving-cert" not found Apr 24 16:39:36.342512 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.342466 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/71bfbde9-7779-4c60-8a8a-0b238f76e255-tmp-dir\") pod \"dns-default-f6b8n\" (UID: \"71bfbde9-7779-4c60-8a8a-0b238f76e255\") " pod="openshift-dns/dns-default-f6b8n" Apr 24 16:39:36.342512 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:36.342491 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:36.342636 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:36.342553 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls podName:71bfbde9-7779-4c60-8a8a-0b238f76e255 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:36.842535317 +0000 UTC m=+35.107250703 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls") pod "dns-default-f6b8n" (UID: "71bfbde9-7779-4c60-8a8a-0b238f76e255") : secret "dns-default-metrics-tls" not found Apr 24 16:39:36.343178 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.343163 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71bfbde9-7779-4c60-8a8a-0b238f76e255-config-volume\") pod \"dns-default-f6b8n\" (UID: \"71bfbde9-7779-4c60-8a8a-0b238f76e255\") " pod="openshift-dns/dns-default-f6b8n" Apr 24 16:39:36.354206 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.354183 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmbdc\" (UniqueName: \"kubernetes.io/projected/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-kube-api-access-xmbdc\") pod \"ingress-canary-6b27c\" (UID: \"8ae78d18-eee9-4ff6-b5b1-81a6bd62493c\") " pod="openshift-ingress-canary/ingress-canary-6b27c" Apr 24 16:39:36.354765 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.354746 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vxq9\" (UniqueName: \"kubernetes.io/projected/71bfbde9-7779-4c60-8a8a-0b238f76e255-kube-api-access-6vxq9\") pod \"dns-default-f6b8n\" (UID: \"71bfbde9-7779-4c60-8a8a-0b238f76e255\") " pod="openshift-dns/dns-default-f6b8n" Apr 24 16:39:36.355827 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.355805 2573 generic.go:358] "Generic (PLEG): container finished" podID="85c3e0af-8027-49c2-937d-99acbd5f7085" containerID="0de89458583c3282d9b689fe1d66307a8c2fba678d752f5cb54c2d465b8352e8" exitCode=0 Apr 24 16:39:36.355929 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.355839 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x52vd" 
event={"ID":"85c3e0af-8027-49c2-937d-99acbd5f7085","Type":"ContainerDied","Data":"0de89458583c3282d9b689fe1d66307a8c2fba678d752f5cb54c2d465b8352e8"} Apr 24 16:39:36.845498 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.845403 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert\") pod \"ingress-canary-6b27c\" (UID: \"8ae78d18-eee9-4ff6-b5b1-81a6bd62493c\") " pod="openshift-ingress-canary/ingress-canary-6b27c" Apr 24 16:39:36.845498 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:36.845450 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls\") pod \"dns-default-f6b8n\" (UID: \"71bfbde9-7779-4c60-8a8a-0b238f76e255\") " pod="openshift-dns/dns-default-f6b8n" Apr 24 16:39:36.845680 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:36.845550 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:36.845680 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:36.845552 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:36.845680 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:36.845601 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls podName:71bfbde9-7779-4c60-8a8a-0b238f76e255 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:37.845588392 +0000 UTC m=+36.110303767 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls") pod "dns-default-f6b8n" (UID: "71bfbde9-7779-4c60-8a8a-0b238f76e255") : secret "dns-default-metrics-tls" not found Apr 24 16:39:36.845680 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:36.845616 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert podName:8ae78d18-eee9-4ff6-b5b1-81a6bd62493c nodeName:}" failed. No retries permitted until 2026-04-24 16:39:37.845609636 +0000 UTC m=+36.110325011 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert") pod "ingress-canary-6b27c" (UID: "8ae78d18-eee9-4ff6-b5b1-81a6bd62493c") : secret "canary-serving-cert" not found Apr 24 16:39:37.212842 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:37.212760 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mgmm7" Apr 24 16:39:37.213011 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:37.212760 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-74mjh" Apr 24 16:39:37.215934 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:37.215913 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 16:39:37.216022 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:37.215929 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 16:39:37.216022 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:37.215978 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 16:39:37.216130 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:37.216029 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zln8m\"" Apr 24 16:39:37.216130 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:37.216050 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-rvcx9\"" Apr 24 16:39:37.360173 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:37.360143 2573 generic.go:358] "Generic (PLEG): container finished" podID="85c3e0af-8027-49c2-937d-99acbd5f7085" containerID="8a99d7102d00ed6d23ffdadc01bebba7434212df4655d52b01e8aeae2bbc5412" exitCode=0 Apr 24 16:39:37.360740 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:37.360185 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x52vd" event={"ID":"85c3e0af-8027-49c2-937d-99acbd5f7085","Type":"ContainerDied","Data":"8a99d7102d00ed6d23ffdadc01bebba7434212df4655d52b01e8aeae2bbc5412"} Apr 24 16:39:37.853547 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:37.853515 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls\") pod \"dns-default-f6b8n\" (UID: \"71bfbde9-7779-4c60-8a8a-0b238f76e255\") " pod="openshift-dns/dns-default-f6b8n" Apr 24 16:39:37.853692 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:37.853595 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert\") pod \"ingress-canary-6b27c\" (UID: \"8ae78d18-eee9-4ff6-b5b1-81a6bd62493c\") " pod="openshift-ingress-canary/ingress-canary-6b27c" Apr 24 16:39:37.853692 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:37.853661 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:37.853692 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:37.853670 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:37.853788 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:37.853730 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls podName:71bfbde9-7779-4c60-8a8a-0b238f76e255 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:39.85371021 +0000 UTC m=+38.118425604 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls") pod "dns-default-f6b8n" (UID: "71bfbde9-7779-4c60-8a8a-0b238f76e255") : secret "dns-default-metrics-tls" not found Apr 24 16:39:37.853788 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:37.853748 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert podName:8ae78d18-eee9-4ff6-b5b1-81a6bd62493c nodeName:}" failed. 
No retries permitted until 2026-04-24 16:39:39.853740102 +0000 UTC m=+38.118455479 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert") pod "ingress-canary-6b27c" (UID: "8ae78d18-eee9-4ff6-b5b1-81a6bd62493c") : secret "canary-serving-cert" not found Apr 24 16:39:38.364758 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:38.364725 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x52vd" event={"ID":"85c3e0af-8027-49c2-937d-99acbd5f7085","Type":"ContainerStarted","Data":"f246d05cb1030fa414eefcc79cd0fe4358be24de6a7382b1291db466b815dc37"} Apr 24 16:39:38.387941 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:38.387894 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-x52vd" podStartSLOduration=5.357866929 podStartE2EDuration="36.387879991s" podCreationTimestamp="2026-04-24 16:39:02 +0000 UTC" firstStartedPulling="2026-04-24 16:39:04.828629214 +0000 UTC m=+3.093344592" lastFinishedPulling="2026-04-24 16:39:35.858642251 +0000 UTC m=+34.123357654" observedRunningTime="2026-04-24 16:39:38.387475601 +0000 UTC m=+36.652191008" watchObservedRunningTime="2026-04-24 16:39:38.387879991 +0000 UTC m=+36.652595387" Apr 24 16:39:39.868947 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:39.868913 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert\") pod \"ingress-canary-6b27c\" (UID: \"8ae78d18-eee9-4ff6-b5b1-81a6bd62493c\") " pod="openshift-ingress-canary/ingress-canary-6b27c" Apr 24 16:39:39.868947 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:39.868950 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls\") pod \"dns-default-f6b8n\" (UID: \"71bfbde9-7779-4c60-8a8a-0b238f76e255\") " pod="openshift-dns/dns-default-f6b8n" Apr 24 16:39:39.869334 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:39.869035 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:39.869334 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:39.869040 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:39.869334 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:39.869090 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls podName:71bfbde9-7779-4c60-8a8a-0b238f76e255 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:43.869075229 +0000 UTC m=+42.133790604 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls") pod "dns-default-f6b8n" (UID: "71bfbde9-7779-4c60-8a8a-0b238f76e255") : secret "dns-default-metrics-tls" not found Apr 24 16:39:39.869334 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:39.869108 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert podName:8ae78d18-eee9-4ff6-b5b1-81a6bd62493c nodeName:}" failed. No retries permitted until 2026-04-24 16:39:43.869096578 +0000 UTC m=+42.133811953 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert") pod "ingress-canary-6b27c" (UID: "8ae78d18-eee9-4ff6-b5b1-81a6bd62493c") : secret "canary-serving-cert" not found Apr 24 16:39:43.897973 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:43.897932 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert\") pod \"ingress-canary-6b27c\" (UID: \"8ae78d18-eee9-4ff6-b5b1-81a6bd62493c\") " pod="openshift-ingress-canary/ingress-canary-6b27c" Apr 24 16:39:43.898371 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:43.897978 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls\") pod \"dns-default-f6b8n\" (UID: \"71bfbde9-7779-4c60-8a8a-0b238f76e255\") " pod="openshift-dns/dns-default-f6b8n" Apr 24 16:39:43.898371 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:43.898091 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:43.898371 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:43.898100 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:43.898371 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:43.898157 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert podName:8ae78d18-eee9-4ff6-b5b1-81a6bd62493c nodeName:}" failed. No retries permitted until 2026-04-24 16:39:51.898138892 +0000 UTC m=+50.162854267 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert") pod "ingress-canary-6b27c" (UID: "8ae78d18-eee9-4ff6-b5b1-81a6bd62493c") : secret "canary-serving-cert" not found Apr 24 16:39:43.898371 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:43.898171 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls podName:71bfbde9-7779-4c60-8a8a-0b238f76e255 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:51.898165418 +0000 UTC m=+50.162880793 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls") pod "dns-default-f6b8n" (UID: "71bfbde9-7779-4c60-8a8a-0b238f76e255") : secret "dns-default-metrics-tls" not found Apr 24 16:39:51.951696 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:51.951658 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert\") pod \"ingress-canary-6b27c\" (UID: \"8ae78d18-eee9-4ff6-b5b1-81a6bd62493c\") " pod="openshift-ingress-canary/ingress-canary-6b27c" Apr 24 16:39:51.951696 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:51.951696 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls\") pod \"dns-default-f6b8n\" (UID: \"71bfbde9-7779-4c60-8a8a-0b238f76e255\") " pod="openshift-dns/dns-default-f6b8n" Apr 24 16:39:51.952226 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:51.951793 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:51.952226 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:51.951796 2573 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:51.952226 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:51.951843 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls podName:71bfbde9-7779-4c60-8a8a-0b238f76e255 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:07.951829364 +0000 UTC m=+66.216544739 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls") pod "dns-default-f6b8n" (UID: "71bfbde9-7779-4c60-8a8a-0b238f76e255") : secret "dns-default-metrics-tls" not found Apr 24 16:39:51.952226 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:39:51.951855 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert podName:8ae78d18-eee9-4ff6-b5b1-81a6bd62493c nodeName:}" failed. No retries permitted until 2026-04-24 16:40:07.95184964 +0000 UTC m=+66.216565015 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert") pod "ingress-canary-6b27c" (UID: "8ae78d18-eee9-4ff6-b5b1-81a6bd62493c") : secret "canary-serving-cert" not found Apr 24 16:39:54.054636 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.054602 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7445d97c6f-6zhnn"] Apr 24 16:39:54.093053 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.093022 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7445d97c6f-6zhnn"] Apr 24 16:39:54.093188 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.093130 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7445d97c6f-6zhnn" Apr 24 16:39:54.096767 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.096743 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 24 16:39:54.096767 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.096755 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 24 16:39:54.096966 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.096807 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-fbh2m\"" Apr 24 16:39:54.096966 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.096883 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 24 16:39:54.097139 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.097123 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 24 16:39:54.123754 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.123736 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl"] Apr 24 16:39:54.141202 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.141168 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl"] Apr 24 16:39:54.141321 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.141271 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl" Apr 24 16:39:54.143699 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.143677 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 24 16:39:54.143798 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.143741 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 24 16:39:54.143798 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.143755 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 24 16:39:54.144009 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.143989 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 24 16:39:54.165633 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.165611 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ncgx\" (UniqueName: \"kubernetes.io/projected/65ad11ad-8a1a-4936-b94e-aba36a8bc166-kube-api-access-7ncgx\") pod \"managed-serviceaccount-addon-agent-7445d97c6f-6zhnn\" (UID: \"65ad11ad-8a1a-4936-b94e-aba36a8bc166\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7445d97c6f-6zhnn" Apr 24 16:39:54.165764 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.165677 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/65ad11ad-8a1a-4936-b94e-aba36a8bc166-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7445d97c6f-6zhnn\" (UID: 
\"65ad11ad-8a1a-4936-b94e-aba36a8bc166\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7445d97c6f-6zhnn" Apr 24 16:39:54.266827 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.266800 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/1486bd60-1c32-4e9b-a771-ae9a78a6a370-hub\") pod \"cluster-proxy-proxy-agent-86d5c5cb54-xhkzl\" (UID: \"1486bd60-1c32-4e9b-a771-ae9a78a6a370\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl" Apr 24 16:39:54.266827 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.266829 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbplc\" (UniqueName: \"kubernetes.io/projected/1486bd60-1c32-4e9b-a771-ae9a78a6a370-kube-api-access-rbplc\") pod \"cluster-proxy-proxy-agent-86d5c5cb54-xhkzl\" (UID: \"1486bd60-1c32-4e9b-a771-ae9a78a6a370\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl" Apr 24 16:39:54.266972 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.266865 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/1486bd60-1c32-4e9b-a771-ae9a78a6a370-ca\") pod \"cluster-proxy-proxy-agent-86d5c5cb54-xhkzl\" (UID: \"1486bd60-1c32-4e9b-a771-ae9a78a6a370\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl" Apr 24 16:39:54.266972 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.266889 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/1486bd60-1c32-4e9b-a771-ae9a78a6a370-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-86d5c5cb54-xhkzl\" (UID: \"1486bd60-1c32-4e9b-a771-ae9a78a6a370\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl"
Apr 24 16:39:54.266972 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.266921 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/65ad11ad-8a1a-4936-b94e-aba36a8bc166-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7445d97c6f-6zhnn\" (UID: \"65ad11ad-8a1a-4936-b94e-aba36a8bc166\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7445d97c6f-6zhnn"
Apr 24 16:39:54.267062 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.266975 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/1486bd60-1c32-4e9b-a771-ae9a78a6a370-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-86d5c5cb54-xhkzl\" (UID: \"1486bd60-1c32-4e9b-a771-ae9a78a6a370\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl"
Apr 24 16:39:54.267062 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.267008 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ncgx\" (UniqueName: \"kubernetes.io/projected/65ad11ad-8a1a-4936-b94e-aba36a8bc166-kube-api-access-7ncgx\") pod \"managed-serviceaccount-addon-agent-7445d97c6f-6zhnn\" (UID: \"65ad11ad-8a1a-4936-b94e-aba36a8bc166\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7445d97c6f-6zhnn"
Apr 24 16:39:54.267062 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.267038 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1486bd60-1c32-4e9b-a771-ae9a78a6a370-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-86d5c5cb54-xhkzl\" (UID: \"1486bd60-1c32-4e9b-a771-ae9a78a6a370\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl"
Apr 24 16:39:54.269818 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.269788 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/65ad11ad-8a1a-4936-b94e-aba36a8bc166-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7445d97c6f-6zhnn\" (UID: \"65ad11ad-8a1a-4936-b94e-aba36a8bc166\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7445d97c6f-6zhnn"
Apr 24 16:39:54.275120 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.275100 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ncgx\" (UniqueName: \"kubernetes.io/projected/65ad11ad-8a1a-4936-b94e-aba36a8bc166-kube-api-access-7ncgx\") pod \"managed-serviceaccount-addon-agent-7445d97c6f-6zhnn\" (UID: \"65ad11ad-8a1a-4936-b94e-aba36a8bc166\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7445d97c6f-6zhnn"
Apr 24 16:39:54.367518 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.367449 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/1486bd60-1c32-4e9b-a771-ae9a78a6a370-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-86d5c5cb54-xhkzl\" (UID: \"1486bd60-1c32-4e9b-a771-ae9a78a6a370\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl"
Apr 24 16:39:54.367518 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.367490 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1486bd60-1c32-4e9b-a771-ae9a78a6a370-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-86d5c5cb54-xhkzl\" (UID: \"1486bd60-1c32-4e9b-a771-ae9a78a6a370\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl"
Apr 24 16:39:54.367518 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.367511 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/1486bd60-1c32-4e9b-a771-ae9a78a6a370-hub\") pod \"cluster-proxy-proxy-agent-86d5c5cb54-xhkzl\" (UID: \"1486bd60-1c32-4e9b-a771-ae9a78a6a370\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl"
Apr 24 16:39:54.367751 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.367526 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbplc\" (UniqueName: \"kubernetes.io/projected/1486bd60-1c32-4e9b-a771-ae9a78a6a370-kube-api-access-rbplc\") pod \"cluster-proxy-proxy-agent-86d5c5cb54-xhkzl\" (UID: \"1486bd60-1c32-4e9b-a771-ae9a78a6a370\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl"
Apr 24 16:39:54.367751 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.367554 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/1486bd60-1c32-4e9b-a771-ae9a78a6a370-ca\") pod \"cluster-proxy-proxy-agent-86d5c5cb54-xhkzl\" (UID: \"1486bd60-1c32-4e9b-a771-ae9a78a6a370\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl"
Apr 24 16:39:54.367751 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.367573 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/1486bd60-1c32-4e9b-a771-ae9a78a6a370-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-86d5c5cb54-xhkzl\" (UID: \"1486bd60-1c32-4e9b-a771-ae9a78a6a370\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl"
Apr 24 16:39:54.368433 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.368277 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/1486bd60-1c32-4e9b-a771-ae9a78a6a370-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-86d5c5cb54-xhkzl\" (UID: \"1486bd60-1c32-4e9b-a771-ae9a78a6a370\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl"
Apr 24 16:39:54.370074 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.370043 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/1486bd60-1c32-4e9b-a771-ae9a78a6a370-hub\") pod \"cluster-proxy-proxy-agent-86d5c5cb54-xhkzl\" (UID: \"1486bd60-1c32-4e9b-a771-ae9a78a6a370\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl"
Apr 24 16:39:54.370197 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.370174 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/1486bd60-1c32-4e9b-a771-ae9a78a6a370-ca\") pod \"cluster-proxy-proxy-agent-86d5c5cb54-xhkzl\" (UID: \"1486bd60-1c32-4e9b-a771-ae9a78a6a370\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl"
Apr 24 16:39:54.370258 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.370230 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/1486bd60-1c32-4e9b-a771-ae9a78a6a370-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-86d5c5cb54-xhkzl\" (UID: \"1486bd60-1c32-4e9b-a771-ae9a78a6a370\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl"
Apr 24 16:39:54.370299 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.370276 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1486bd60-1c32-4e9b-a771-ae9a78a6a370-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-86d5c5cb54-xhkzl\" (UID: \"1486bd60-1c32-4e9b-a771-ae9a78a6a370\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl"
Apr 24 16:39:54.375123 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.375105 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbplc\" (UniqueName: \"kubernetes.io/projected/1486bd60-1c32-4e9b-a771-ae9a78a6a370-kube-api-access-rbplc\") pod \"cluster-proxy-proxy-agent-86d5c5cb54-xhkzl\" (UID: \"1486bd60-1c32-4e9b-a771-ae9a78a6a370\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl"
Apr 24 16:39:54.409721 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.409702 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7445d97c6f-6zhnn"
Apr 24 16:39:54.449488 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.449458 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl"
Apr 24 16:39:54.542096 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.542048 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7445d97c6f-6zhnn"]
Apr 24 16:39:54.545081 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:54.545045 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65ad11ad_8a1a_4936_b94e_aba36a8bc166.slice/crio-0733e62b4c079a3b882a9c92c045324350cd0aae2dc7f0634252100a4017905b WatchSource:0}: Error finding container 0733e62b4c079a3b882a9c92c045324350cd0aae2dc7f0634252100a4017905b: Status 404 returned error can't find the container with id 0733e62b4c079a3b882a9c92c045324350cd0aae2dc7f0634252100a4017905b
Apr 24 16:39:54.580712 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:54.580684 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl"]
Apr 24 16:39:54.583889 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:39:54.583864 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1486bd60_1c32_4e9b_a771_ae9a78a6a370.slice/crio-c43357b4388654289a90388659b3dc14c65885787a8fa4230348bde56b93e731 WatchSource:0}: Error finding container c43357b4388654289a90388659b3dc14c65885787a8fa4230348bde56b93e731: Status 404 returned error can't find the container with id c43357b4388654289a90388659b3dc14c65885787a8fa4230348bde56b93e731
Apr 24 16:39:55.399344 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:55.399285 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl" event={"ID":"1486bd60-1c32-4e9b-a771-ae9a78a6a370","Type":"ContainerStarted","Data":"c43357b4388654289a90388659b3dc14c65885787a8fa4230348bde56b93e731"}
Apr 24 16:39:55.400714 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:55.400681 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7445d97c6f-6zhnn" event={"ID":"65ad11ad-8a1a-4936-b94e-aba36a8bc166","Type":"ContainerStarted","Data":"0733e62b4c079a3b882a9c92c045324350cd0aae2dc7f0634252100a4017905b"}
Apr 24 16:39:59.346822 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:59.346794 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lkfqt"
Apr 24 16:39:59.409432 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:59.409392 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl" event={"ID":"1486bd60-1c32-4e9b-a771-ae9a78a6a370","Type":"ContainerStarted","Data":"7986006a570e728b4a5374e0013c8e249de2f7000e262022f4bfe2579795c83f"}
Apr 24 16:39:59.410659 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:59.410632 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7445d97c6f-6zhnn" event={"ID":"65ad11ad-8a1a-4936-b94e-aba36a8bc166","Type":"ContainerStarted","Data":"5915830a248e8ff75be1207935ed402d1b85c8dfec020574408821c1dfe978dc"}
Apr 24 16:39:59.432849 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:39:59.432802 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7445d97c6f-6zhnn" podStartSLOduration=1.302794391 podStartE2EDuration="5.432789788s" podCreationTimestamp="2026-04-24 16:39:54 +0000 UTC" firstStartedPulling="2026-04-24 16:39:54.547240427 +0000 UTC m=+52.811955802" lastFinishedPulling="2026-04-24 16:39:58.677235805 +0000 UTC m=+56.941951199" observedRunningTime="2026-04-24 16:39:59.432467226 +0000 UTC m=+57.697182622" watchObservedRunningTime="2026-04-24 16:39:59.432789788 +0000 UTC m=+57.697505184"
Apr 24 16:40:01.415949 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:40:01.415912 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl" event={"ID":"1486bd60-1c32-4e9b-a771-ae9a78a6a370","Type":"ContainerStarted","Data":"27e12c0be53812b91e0525138b914f95100922458886862194854fe367325c6e"}
Apr 24 16:40:01.415949 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:40:01.415950 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl" event={"ID":"1486bd60-1c32-4e9b-a771-ae9a78a6a370","Type":"ContainerStarted","Data":"a31483d214cf1aa0ddff961697333536b14e49907b6751a6c107d958f6b5f90d"}
Apr 24 16:40:01.450248 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:40:01.450172 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl" podStartSLOduration=1.182732865 podStartE2EDuration="7.450153056s" podCreationTimestamp="2026-04-24 16:39:54 +0000 UTC" firstStartedPulling="2026-04-24 16:39:54.585564377 +0000 UTC m=+52.850279753" lastFinishedPulling="2026-04-24 16:40:00.852984565 +0000 UTC m=+59.117699944" observedRunningTime="2026-04-24 16:40:01.448393023 +0000 UTC m=+59.713108421" watchObservedRunningTime="2026-04-24 16:40:01.450153056 +0000 UTC m=+59.714868452"
Apr 24 16:40:07.969941 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:40:07.969898 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs\") pod \"network-metrics-daemon-74mjh\" (UID: \"3052a162-5d36-4309-9bd0-bca01410b715\") " pod="openshift-multus/network-metrics-daemon-74mjh"
Apr 24 16:40:07.970488 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:40:07.969963 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f66k6\" (UniqueName: \"kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6\") pod \"network-check-target-mgmm7\" (UID: \"4f65e195-bb0b-4b16-893b-21667f13f3a5\") " pod="openshift-network-diagnostics/network-check-target-mgmm7"
Apr 24 16:40:07.970488 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:40:07.969991 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert\") pod \"ingress-canary-6b27c\" (UID: \"8ae78d18-eee9-4ff6-b5b1-81a6bd62493c\") " pod="openshift-ingress-canary/ingress-canary-6b27c"
Apr 24 16:40:07.970488 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:40:07.970011 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls\") pod \"dns-default-f6b8n\" (UID: \"71bfbde9-7779-4c60-8a8a-0b238f76e255\") " pod="openshift-dns/dns-default-f6b8n"
Apr 24 16:40:07.970488 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:40:07.970116 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:40:07.970488 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:40:07.970166 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:40:07.970488 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:40:07.970182 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls podName:71bfbde9-7779-4c60-8a8a-0b238f76e255 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:39.970169925 +0000 UTC m=+98.234885300 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls") pod "dns-default-f6b8n" (UID: "71bfbde9-7779-4c60-8a8a-0b238f76e255") : secret "dns-default-metrics-tls" not found
Apr 24 16:40:07.970488 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:40:07.970263 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert podName:8ae78d18-eee9-4ff6-b5b1-81a6bd62493c nodeName:}" failed. No retries permitted until 2026-04-24 16:40:39.970242847 +0000 UTC m=+98.234958230 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert") pod "ingress-canary-6b27c" (UID: "8ae78d18-eee9-4ff6-b5b1-81a6bd62493c") : secret "canary-serving-cert" not found
Apr 24 16:40:07.972830 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:40:07.972808 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 16:40:07.972909 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:40:07.972892 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 16:40:07.980701 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:40:07.980678 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 16:40:07.980794 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:40:07.980748 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs podName:3052a162-5d36-4309-9bd0-bca01410b715 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:11.980734345 +0000 UTC m=+130.245449720 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs") pod "network-metrics-daemon-74mjh" (UID: "3052a162-5d36-4309-9bd0-bca01410b715") : secret "metrics-daemon-secret" not found
Apr 24 16:40:07.982630 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:40:07.982612 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 16:40:07.993117 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:40:07.993091 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f66k6\" (UniqueName: \"kubernetes.io/projected/4f65e195-bb0b-4b16-893b-21667f13f3a5-kube-api-access-f66k6\") pod \"network-check-target-mgmm7\" (UID: \"4f65e195-bb0b-4b16-893b-21667f13f3a5\") " pod="openshift-network-diagnostics/network-check-target-mgmm7"
Apr 24 16:40:08.136124 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:40:08.136093 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-rvcx9\""
Apr 24 16:40:08.144114 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:40:08.144093 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mgmm7"
Apr 24 16:40:08.257604 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:40:08.257575 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mgmm7"]
Apr 24 16:40:08.262119 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:40:08.262090 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f65e195_bb0b_4b16_893b_21667f13f3a5.slice/crio-5c8171b8a2aaed0ba3e16ca8f3673228dd460b17d46a50e8485d8dac4768f4ec WatchSource:0}: Error finding container 5c8171b8a2aaed0ba3e16ca8f3673228dd460b17d46a50e8485d8dac4768f4ec: Status 404 returned error can't find the container with id 5c8171b8a2aaed0ba3e16ca8f3673228dd460b17d46a50e8485d8dac4768f4ec
Apr 24 16:40:08.429577 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:40:08.429547 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mgmm7" event={"ID":"4f65e195-bb0b-4b16-893b-21667f13f3a5","Type":"ContainerStarted","Data":"5c8171b8a2aaed0ba3e16ca8f3673228dd460b17d46a50e8485d8dac4768f4ec"}
Apr 24 16:40:11.439255 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:40:11.439222 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mgmm7" event={"ID":"4f65e195-bb0b-4b16-893b-21667f13f3a5","Type":"ContainerStarted","Data":"83f34f2e16c7f3eb4af39630fbe4984829d6e80b6f618b57d040e5500ed4cb14"}
Apr 24 16:40:11.439610 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:40:11.439386 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-mgmm7"
Apr 24 16:40:11.455239 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:40:11.455192 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-mgmm7" podStartSLOduration=66.929517509 podStartE2EDuration="1m9.455179513s" podCreationTimestamp="2026-04-24 16:39:02 +0000 UTC" firstStartedPulling="2026-04-24 16:40:08.264263397 +0000 UTC m=+66.528978773" lastFinishedPulling="2026-04-24 16:40:10.789925385 +0000 UTC m=+69.054640777" observedRunningTime="2026-04-24 16:40:11.454993923 +0000 UTC m=+69.719709334" watchObservedRunningTime="2026-04-24 16:40:11.455179513 +0000 UTC m=+69.719894914"
Apr 24 16:40:39.984232 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:40:39.984193 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls\") pod \"dns-default-f6b8n\" (UID: \"71bfbde9-7779-4c60-8a8a-0b238f76e255\") " pod="openshift-dns/dns-default-f6b8n"
Apr 24 16:40:39.984740 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:40:39.984280 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert\") pod \"ingress-canary-6b27c\" (UID: \"8ae78d18-eee9-4ff6-b5b1-81a6bd62493c\") " pod="openshift-ingress-canary/ingress-canary-6b27c"
Apr 24 16:40:39.984740 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:40:39.984360 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:40:39.984740 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:40:39.984403 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:40:39.984740 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:40:39.984428 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls podName:71bfbde9-7779-4c60-8a8a-0b238f76e255 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:43.984410564 +0000 UTC m=+162.249125939 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls") pod "dns-default-f6b8n" (UID: "71bfbde9-7779-4c60-8a8a-0b238f76e255") : secret "dns-default-metrics-tls" not found
Apr 24 16:40:39.984740 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:40:39.984452 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert podName:8ae78d18-eee9-4ff6-b5b1-81a6bd62493c nodeName:}" failed. No retries permitted until 2026-04-24 16:41:43.98443688 +0000 UTC m=+162.249152255 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert") pod "ingress-canary-6b27c" (UID: "8ae78d18-eee9-4ff6-b5b1-81a6bd62493c") : secret "canary-serving-cert" not found
Apr 24 16:40:42.444849 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:40:42.444817 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-mgmm7"
Apr 24 16:41:06.538209 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:06.538182 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-djdhr_2267e8c5-f77e-4e17-a96f-9463ea75c147/dns-node-resolver/0.log"
Apr 24 16:41:07.339036 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:07.339012 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-slcg8_6fd88de9-ce02-44e6-9d5c-0b5dbe13f0c4/node-ca/0.log"
Apr 24 16:41:11.997791 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:11.997750 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs\") pod \"network-metrics-daemon-74mjh\" (UID: \"3052a162-5d36-4309-9bd0-bca01410b715\") " pod="openshift-multus/network-metrics-daemon-74mjh"
Apr 24 16:41:11.998375 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:41:11.997890 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 16:41:11.998375 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:41:11.997976 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs podName:3052a162-5d36-4309-9bd0-bca01410b715 nodeName:}" failed. No retries permitted until 2026-04-24 16:43:13.99795466 +0000 UTC m=+252.262670036 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs") pod "network-metrics-daemon-74mjh" (UID: "3052a162-5d36-4309-9bd0-bca01410b715") : secret "metrics-daemon-secret" not found
Apr 24 16:41:17.731966 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:17.731928 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8whz2"]
Apr 24 16:41:17.733716 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:17.733701 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8whz2"
Apr 24 16:41:17.735837 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:17.735809 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 16:41:17.736703 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:17.736684 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 16:41:17.736779 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:17.736720 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 16:41:17.736779 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:17.736688 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8vx6n\""
Apr 24 16:41:17.737083 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:17.737062 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 16:41:17.743215 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:17.743191 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8whz2"]
Apr 24 16:41:17.840237 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:17.840200 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bc890b75-e729-4d8c-8e1d-05bc27ad8717-crio-socket\") pod \"insights-runtime-extractor-8whz2\" (UID: \"bc890b75-e729-4d8c-8e1d-05bc27ad8717\") " pod="openshift-insights/insights-runtime-extractor-8whz2"
Apr 24 16:41:17.840237 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:17.840248 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp42w\" (UniqueName: \"kubernetes.io/projected/bc890b75-e729-4d8c-8e1d-05bc27ad8717-kube-api-access-kp42w\") pod \"insights-runtime-extractor-8whz2\" (UID: \"bc890b75-e729-4d8c-8e1d-05bc27ad8717\") " pod="openshift-insights/insights-runtime-extractor-8whz2"
Apr 24 16:41:17.840505 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:17.840366 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bc890b75-e729-4d8c-8e1d-05bc27ad8717-data-volume\") pod \"insights-runtime-extractor-8whz2\" (UID: \"bc890b75-e729-4d8c-8e1d-05bc27ad8717\") " pod="openshift-insights/insights-runtime-extractor-8whz2"
Apr 24 16:41:17.840505 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:17.840412 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bc890b75-e729-4d8c-8e1d-05bc27ad8717-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8whz2\" (UID: \"bc890b75-e729-4d8c-8e1d-05bc27ad8717\") " pod="openshift-insights/insights-runtime-extractor-8whz2"
Apr 24 16:41:17.840505 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:17.840465 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bc890b75-e729-4d8c-8e1d-05bc27ad8717-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8whz2\" (UID: \"bc890b75-e729-4d8c-8e1d-05bc27ad8717\") " pod="openshift-insights/insights-runtime-extractor-8whz2"
Apr 24 16:41:17.941360 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:17.941324 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bc890b75-e729-4d8c-8e1d-05bc27ad8717-data-volume\") pod \"insights-runtime-extractor-8whz2\" (UID: \"bc890b75-e729-4d8c-8e1d-05bc27ad8717\") " pod="openshift-insights/insights-runtime-extractor-8whz2"
Apr 24 16:41:17.941507 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:17.941383 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bc890b75-e729-4d8c-8e1d-05bc27ad8717-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8whz2\" (UID: \"bc890b75-e729-4d8c-8e1d-05bc27ad8717\") " pod="openshift-insights/insights-runtime-extractor-8whz2"
Apr 24 16:41:17.941507 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:17.941422 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bc890b75-e729-4d8c-8e1d-05bc27ad8717-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8whz2\" (UID: \"bc890b75-e729-4d8c-8e1d-05bc27ad8717\") " pod="openshift-insights/insights-runtime-extractor-8whz2"
Apr 24 16:41:17.941507 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:17.941462 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bc890b75-e729-4d8c-8e1d-05bc27ad8717-crio-socket\") pod \"insights-runtime-extractor-8whz2\" (UID: \"bc890b75-e729-4d8c-8e1d-05bc27ad8717\") " pod="openshift-insights/insights-runtime-extractor-8whz2"
Apr 24 16:41:17.941507 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:17.941486 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kp42w\" (UniqueName: \"kubernetes.io/projected/bc890b75-e729-4d8c-8e1d-05bc27ad8717-kube-api-access-kp42w\") pod \"insights-runtime-extractor-8whz2\" (UID: \"bc890b75-e729-4d8c-8e1d-05bc27ad8717\") " pod="openshift-insights/insights-runtime-extractor-8whz2"
Apr 24 16:41:17.941714 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:17.941564 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bc890b75-e729-4d8c-8e1d-05bc27ad8717-crio-socket\") pod \"insights-runtime-extractor-8whz2\" (UID: \"bc890b75-e729-4d8c-8e1d-05bc27ad8717\") " pod="openshift-insights/insights-runtime-extractor-8whz2"
Apr 24 16:41:17.941714 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:41:17.941581 2573 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 24 16:41:17.941714 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:17.941681 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bc890b75-e729-4d8c-8e1d-05bc27ad8717-data-volume\") pod \"insights-runtime-extractor-8whz2\" (UID: \"bc890b75-e729-4d8c-8e1d-05bc27ad8717\") " pod="openshift-insights/insights-runtime-extractor-8whz2"
Apr 24 16:41:17.941714 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:41:17.941674 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc890b75-e729-4d8c-8e1d-05bc27ad8717-insights-runtime-extractor-tls podName:bc890b75-e729-4d8c-8e1d-05bc27ad8717 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:18.441650477 +0000 UTC m=+136.706365856 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/bc890b75-e729-4d8c-8e1d-05bc27ad8717-insights-runtime-extractor-tls") pod "insights-runtime-extractor-8whz2" (UID: "bc890b75-e729-4d8c-8e1d-05bc27ad8717") : secret "insights-runtime-extractor-tls" not found
Apr 24 16:41:17.941946 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:17.941928 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bc890b75-e729-4d8c-8e1d-05bc27ad8717-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8whz2\" (UID: \"bc890b75-e729-4d8c-8e1d-05bc27ad8717\") " pod="openshift-insights/insights-runtime-extractor-8whz2"
Apr 24 16:41:17.951629 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:17.951603 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp42w\" (UniqueName: \"kubernetes.io/projected/bc890b75-e729-4d8c-8e1d-05bc27ad8717-kube-api-access-kp42w\") pod \"insights-runtime-extractor-8whz2\" (UID: \"bc890b75-e729-4d8c-8e1d-05bc27ad8717\") " pod="openshift-insights/insights-runtime-extractor-8whz2"
Apr 24 16:41:18.445203 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:18.445162 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bc890b75-e729-4d8c-8e1d-05bc27ad8717-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8whz2\" (UID: \"bc890b75-e729-4d8c-8e1d-05bc27ad8717\") " pod="openshift-insights/insights-runtime-extractor-8whz2"
Apr 24 16:41:18.445398 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:41:18.445350 2573 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 24 16:41:18.445447 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:41:18.445434 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc890b75-e729-4d8c-8e1d-05bc27ad8717-insights-runtime-extractor-tls podName:bc890b75-e729-4d8c-8e1d-05bc27ad8717 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:19.44541693 +0000 UTC m=+137.710132306 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/bc890b75-e729-4d8c-8e1d-05bc27ad8717-insights-runtime-extractor-tls") pod "insights-runtime-extractor-8whz2" (UID: "bc890b75-e729-4d8c-8e1d-05bc27ad8717") : secret "insights-runtime-extractor-tls" not found
Apr 24 16:41:19.451462 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:19.451403 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bc890b75-e729-4d8c-8e1d-05bc27ad8717-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8whz2\" (UID: \"bc890b75-e729-4d8c-8e1d-05bc27ad8717\") " pod="openshift-insights/insights-runtime-extractor-8whz2"
Apr 24 16:41:19.451881 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:41:19.451546 2573 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 24 16:41:19.451881 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:41:19.451618 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc890b75-e729-4d8c-8e1d-05bc27ad8717-insights-runtime-extractor-tls podName:bc890b75-e729-4d8c-8e1d-05bc27ad8717 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:21.451603998 +0000 UTC m=+139.716319373 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/bc890b75-e729-4d8c-8e1d-05bc27ad8717-insights-runtime-extractor-tls") pod "insights-runtime-extractor-8whz2" (UID: "bc890b75-e729-4d8c-8e1d-05bc27ad8717") : secret "insights-runtime-extractor-tls" not found
Apr 24 16:41:21.465071 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:21.465035 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bc890b75-e729-4d8c-8e1d-05bc27ad8717-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8whz2\" (UID: \"bc890b75-e729-4d8c-8e1d-05bc27ad8717\") " pod="openshift-insights/insights-runtime-extractor-8whz2"
Apr 24 16:41:21.465476 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:41:21.465162 2573 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 24 16:41:21.465476 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:41:21.465225 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc890b75-e729-4d8c-8e1d-05bc27ad8717-insights-runtime-extractor-tls podName:bc890b75-e729-4d8c-8e1d-05bc27ad8717 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:25.465212271 +0000 UTC m=+143.729927646 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/bc890b75-e729-4d8c-8e1d-05bc27ad8717-insights-runtime-extractor-tls") pod "insights-runtime-extractor-8whz2" (UID: "bc890b75-e729-4d8c-8e1d-05bc27ad8717") : secret "insights-runtime-extractor-tls" not found Apr 24 16:41:25.494347 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:25.494290 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bc890b75-e729-4d8c-8e1d-05bc27ad8717-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8whz2\" (UID: \"bc890b75-e729-4d8c-8e1d-05bc27ad8717\") " pod="openshift-insights/insights-runtime-extractor-8whz2" Apr 24 16:41:25.494754 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:41:25.494419 2573 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 16:41:25.494754 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:41:25.494505 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc890b75-e729-4d8c-8e1d-05bc27ad8717-insights-runtime-extractor-tls podName:bc890b75-e729-4d8c-8e1d-05bc27ad8717 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:33.494483887 +0000 UTC m=+151.759199279 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/bc890b75-e729-4d8c-8e1d-05bc27ad8717-insights-runtime-extractor-tls") pod "insights-runtime-extractor-8whz2" (UID: "bc890b75-e729-4d8c-8e1d-05bc27ad8717") : secret "insights-runtime-extractor-tls" not found Apr 24 16:41:33.553900 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:33.553864 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bc890b75-e729-4d8c-8e1d-05bc27ad8717-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8whz2\" (UID: \"bc890b75-e729-4d8c-8e1d-05bc27ad8717\") " pod="openshift-insights/insights-runtime-extractor-8whz2" Apr 24 16:41:33.556227 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:33.556188 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bc890b75-e729-4d8c-8e1d-05bc27ad8717-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8whz2\" (UID: \"bc890b75-e729-4d8c-8e1d-05bc27ad8717\") " pod="openshift-insights/insights-runtime-extractor-8whz2" Apr 24 16:41:33.642513 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:33.642481 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8whz2" Apr 24 16:41:33.754781 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:33.754751 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8whz2"] Apr 24 16:41:33.758988 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:41:33.758964 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc890b75_e729_4d8c_8e1d_05bc27ad8717.slice/crio-2ccd2fa81046f48579c3f5fa846c564873764a5486194dab96bb74d094c195a6 WatchSource:0}: Error finding container 2ccd2fa81046f48579c3f5fa846c564873764a5486194dab96bb74d094c195a6: Status 404 returned error can't find the container with id 2ccd2fa81046f48579c3f5fa846c564873764a5486194dab96bb74d094c195a6 Apr 24 16:41:34.629951 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:34.629863 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8whz2" event={"ID":"bc890b75-e729-4d8c-8e1d-05bc27ad8717","Type":"ContainerStarted","Data":"b29b69f21de972f01b38a57fb3236d8327322f55ae4b45a459bfdb0678d8142c"} Apr 24 16:41:34.629951 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:34.629902 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8whz2" event={"ID":"bc890b75-e729-4d8c-8e1d-05bc27ad8717","Type":"ContainerStarted","Data":"2708f45bc78a05f60032dfd62f3c225670e24d176123e89592574763d4088a58"} Apr 24 16:41:34.629951 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:34.629912 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8whz2" event={"ID":"bc890b75-e729-4d8c-8e1d-05bc27ad8717","Type":"ContainerStarted","Data":"2ccd2fa81046f48579c3f5fa846c564873764a5486194dab96bb74d094c195a6"} Apr 24 16:41:36.636392 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:36.636350 2573 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-insights/insights-runtime-extractor-8whz2" event={"ID":"bc890b75-e729-4d8c-8e1d-05bc27ad8717","Type":"ContainerStarted","Data":"758fc456ed669ffe7bbf00d782a1d0835dd4029c4698016dcda91acf90896327"} Apr 24 16:41:36.657036 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:36.656990 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8whz2" podStartSLOduration=17.817714253 podStartE2EDuration="19.656974363s" podCreationTimestamp="2026-04-24 16:41:17 +0000 UTC" firstStartedPulling="2026-04-24 16:41:33.81051277 +0000 UTC m=+152.075228146" lastFinishedPulling="2026-04-24 16:41:35.649772861 +0000 UTC m=+153.914488256" observedRunningTime="2026-04-24 16:41:36.6562074 +0000 UTC m=+154.920922798" watchObservedRunningTime="2026-04-24 16:41:36.656974363 +0000 UTC m=+154.921689751" Apr 24 16:41:37.462930 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.462897 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-54d688d7d7-nlm52"] Apr 24 16:41:37.464789 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.464774 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:37.467132 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:41:37.467096 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"image-registry-tls\" is forbidden: User \"system:node:ip-10-0-142-182.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'ip-10-0-142-182.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" type="*v1.Secret" Apr 24 16:41:37.467273 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:41:37.467144 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"installation-pull-secrets\" is forbidden: User \"system:node:ip-10-0-142-182.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'ip-10-0-142-182.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" type="*v1.Secret" Apr 24 16:41:37.467371 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.467282 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6j55m\"" Apr 24 16:41:37.468951 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.468932 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 16:41:37.475816 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.475797 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 16:41:37.490443 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.490421 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-54d688d7d7-nlm52"] Apr 24 16:41:37.586565 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.586538 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/185b1c1c-1ad5-4891-831d-b68d86e99611-ca-trust-extracted\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:37.586723 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.586585 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/185b1c1c-1ad5-4891-831d-b68d86e99611-image-registry-private-configuration\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:37.586723 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.586606 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/185b1c1c-1ad5-4891-831d-b68d86e99611-installation-pull-secrets\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:37.586723 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.586629 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n58fs\" (UniqueName: \"kubernetes.io/projected/185b1c1c-1ad5-4891-831d-b68d86e99611-kube-api-access-n58fs\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:37.586723 ip-10-0-142-182 kubenswrapper[2573]: 
I0424 16:41:37.586701 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/185b1c1c-1ad5-4891-831d-b68d86e99611-registry-tls\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:37.586872 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.586730 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/185b1c1c-1ad5-4891-831d-b68d86e99611-trusted-ca\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:37.586872 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.586750 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/185b1c1c-1ad5-4891-831d-b68d86e99611-bound-sa-token\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:37.586872 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.586814 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/185b1c1c-1ad5-4891-831d-b68d86e99611-registry-certificates\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:37.687929 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.687901 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/185b1c1c-1ad5-4891-831d-b68d86e99611-bound-sa-token\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:37.688377 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.687943 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/185b1c1c-1ad5-4891-831d-b68d86e99611-registry-certificates\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:37.688377 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.688017 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/185b1c1c-1ad5-4891-831d-b68d86e99611-ca-trust-extracted\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:37.688377 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.688053 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/185b1c1c-1ad5-4891-831d-b68d86e99611-image-registry-private-configuration\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:37.688377 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.688091 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/185b1c1c-1ad5-4891-831d-b68d86e99611-installation-pull-secrets\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " 
pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:37.688377 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.688112 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n58fs\" (UniqueName: \"kubernetes.io/projected/185b1c1c-1ad5-4891-831d-b68d86e99611-kube-api-access-n58fs\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:37.688377 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.688324 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/185b1c1c-1ad5-4891-831d-b68d86e99611-registry-tls\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:37.688377 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.688362 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/185b1c1c-1ad5-4891-831d-b68d86e99611-trusted-ca\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:37.688737 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.688448 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/185b1c1c-1ad5-4891-831d-b68d86e99611-ca-trust-extracted\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:37.688737 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.688719 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/185b1c1c-1ad5-4891-831d-b68d86e99611-registry-certificates\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:37.689212 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.689194 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/185b1c1c-1ad5-4891-831d-b68d86e99611-trusted-ca\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:37.690591 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.690571 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/185b1c1c-1ad5-4891-831d-b68d86e99611-image-registry-private-configuration\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:37.712427 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.712401 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/185b1c1c-1ad5-4891-831d-b68d86e99611-bound-sa-token\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:37.712770 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:37.712751 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n58fs\" (UniqueName: \"kubernetes.io/projected/185b1c1c-1ad5-4891-831d-b68d86e99611-kube-api-access-n58fs\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:38.273145 
ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:38.273116 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 16:41:38.281258 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:38.281228 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/185b1c1c-1ad5-4891-831d-b68d86e99611-registry-tls\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:38.427245 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:38.427210 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 16:41:38.431260 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:38.431242 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/185b1c1c-1ad5-4891-831d-b68d86e99611-installation-pull-secrets\") pod \"image-registry-54d688d7d7-nlm52\" (UID: \"185b1c1c-1ad5-4891-831d-b68d86e99611\") " pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:38.674145 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:38.674045 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:38.793936 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:38.793906 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-54d688d7d7-nlm52"] Apr 24 16:41:38.796944 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:41:38.796917 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod185b1c1c_1ad5_4891_831d_b68d86e99611.slice/crio-02c77f46641b88a0adacd19ccba0964cc69fe8e568793e831fdeb7732c1d55aa WatchSource:0}: Error finding container 02c77f46641b88a0adacd19ccba0964cc69fe8e568793e831fdeb7732c1d55aa: Status 404 returned error can't find the container with id 02c77f46641b88a0adacd19ccba0964cc69fe8e568793e831fdeb7732c1d55aa Apr 24 16:41:39.123453 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:41:39.123408 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-f6b8n" podUID="71bfbde9-7779-4c60-8a8a-0b238f76e255" Apr 24 16:41:39.176046 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:41:39.176006 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-6b27c" podUID="8ae78d18-eee9-4ff6-b5b1-81a6bd62493c" Apr 24 16:41:39.647515 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:39.647486 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-f6b8n" Apr 24 16:41:39.647698 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:39.647483 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" event={"ID":"185b1c1c-1ad5-4891-831d-b68d86e99611","Type":"ContainerStarted","Data":"4da8f7425e552a47123fa9c93350717897811ad8032d648d3f66ed25927c2c1f"} Apr 24 16:41:39.647764 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:39.647692 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" Apr 24 16:41:39.647764 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:39.647717 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" event={"ID":"185b1c1c-1ad5-4891-831d-b68d86e99611","Type":"ContainerStarted","Data":"02c77f46641b88a0adacd19ccba0964cc69fe8e568793e831fdeb7732c1d55aa"} Apr 24 16:41:39.670573 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:39.670533 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-54d688d7d7-nlm52" podStartSLOduration=2.670521765 podStartE2EDuration="2.670521765s" podCreationTimestamp="2026-04-24 16:41:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:41:39.669648353 +0000 UTC m=+157.934363753" watchObservedRunningTime="2026-04-24 16:41:39.670521765 +0000 UTC m=+157.935237160" Apr 24 16:41:40.226492 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:41:40.226455 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-74mjh" podUID="3052a162-5d36-4309-9bd0-bca01410b715" Apr 24 16:41:44.035968 ip-10-0-142-182 
kubenswrapper[2573]: I0424 16:41:44.035912 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert\") pod \"ingress-canary-6b27c\" (UID: \"8ae78d18-eee9-4ff6-b5b1-81a6bd62493c\") " pod="openshift-ingress-canary/ingress-canary-6b27c" Apr 24 16:41:44.035968 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.035973 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls\") pod \"dns-default-f6b8n\" (UID: \"71bfbde9-7779-4c60-8a8a-0b238f76e255\") " pod="openshift-dns/dns-default-f6b8n" Apr 24 16:41:44.038569 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.038539 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/71bfbde9-7779-4c60-8a8a-0b238f76e255-metrics-tls\") pod \"dns-default-f6b8n\" (UID: \"71bfbde9-7779-4c60-8a8a-0b238f76e255\") " pod="openshift-dns/dns-default-f6b8n" Apr 24 16:41:44.038676 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.038614 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ae78d18-eee9-4ff6-b5b1-81a6bd62493c-cert\") pod \"ingress-canary-6b27c\" (UID: \"8ae78d18-eee9-4ff6-b5b1-81a6bd62493c\") " pod="openshift-ingress-canary/ingress-canary-6b27c" Apr 24 16:41:44.151142 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.151104 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zzj8j\"" Apr 24 16:41:44.159684 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.159656 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-f6b8n" Apr 24 16:41:44.279088 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.279046 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-f6b8n"] Apr 24 16:41:44.283041 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:41:44.283013 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bfbde9_7779_4c60_8a8a_0b238f76e255.slice/crio-ebf3df44f8c28b875f03748cca5bfed32238d874906b7419086e2295e2b35874 WatchSource:0}: Error finding container ebf3df44f8c28b875f03748cca5bfed32238d874906b7419086e2295e2b35874: Status 404 returned error can't find the container with id ebf3df44f8c28b875f03748cca5bfed32238d874906b7419086e2295e2b35874 Apr 24 16:41:44.660093 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.659998 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f6b8n" event={"ID":"71bfbde9-7779-4c60-8a8a-0b238f76e255","Type":"ContainerStarted","Data":"ebf3df44f8c28b875f03748cca5bfed32238d874906b7419086e2295e2b35874"} Apr 24 16:41:44.746461 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.746424 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-pzh2f"] Apr 24 16:41:44.751738 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.751707 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pzh2f" Apr 24 16:41:44.770353 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.770324 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 24 16:41:44.770353 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.770347 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 16:41:44.770529 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.770410 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 16:41:44.770529 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.770327 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 16:41:44.770529 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.770411 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-pvzc7\"" Apr 24 16:41:44.770529 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.770347 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 24 16:41:44.775509 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.775485 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-pzh2f"] Apr 24 16:41:44.810038 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.810001 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-65ts2"] Apr 24 16:41:44.812086 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.812063 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:44.815983 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.815964 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 16:41:44.816424 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.816251 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 16:41:44.816424 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.816418 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-sv478\"" Apr 24 16:41:44.816688 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.816422 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 16:41:44.841857 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.841827 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/460b36bf-3b60-407d-a1f4-a6660b7cb22f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-pzh2f\" (UID: \"460b36bf-3b60-407d-a1f4-a6660b7cb22f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pzh2f" Apr 24 16:41:44.842021 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.841878 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/460b36bf-3b60-407d-a1f4-a6660b7cb22f-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-pzh2f\" (UID: \"460b36bf-3b60-407d-a1f4-a6660b7cb22f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pzh2f" Apr 24 16:41:44.842021 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.841901 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/460b36bf-3b60-407d-a1f4-a6660b7cb22f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-pzh2f\" (UID: \"460b36bf-3b60-407d-a1f4-a6660b7cb22f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pzh2f" Apr 24 16:41:44.842021 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.841964 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ctrz\" (UniqueName: \"kubernetes.io/projected/460b36bf-3b60-407d-a1f4-a6660b7cb22f-kube-api-access-9ctrz\") pod \"openshift-state-metrics-9d44df66c-pzh2f\" (UID: \"460b36bf-3b60-407d-a1f4-a6660b7cb22f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pzh2f" Apr 24 16:41:44.943110 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.943022 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/460b36bf-3b60-407d-a1f4-a6660b7cb22f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-pzh2f\" (UID: \"460b36bf-3b60-407d-a1f4-a6660b7cb22f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pzh2f" Apr 24 16:41:44.943110 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.943082 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/460b36bf-3b60-407d-a1f4-a6660b7cb22f-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-pzh2f\" (UID: \"460b36bf-3b60-407d-a1f4-a6660b7cb22f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pzh2f" Apr 24 16:41:44.943360 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.943116 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-node-exporter-wtmp\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:44.943360 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.943142 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-metrics-client-ca\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:44.943360 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.943173 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/460b36bf-3b60-407d-a1f4-a6660b7cb22f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-pzh2f\" (UID: \"460b36bf-3b60-407d-a1f4-a6660b7cb22f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pzh2f" Apr 24 16:41:44.943360 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:41:44.943196 2573 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 24 16:41:44.943360 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.943300 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-node-exporter-textfile\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:44.943549 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:41:44.943363 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/460b36bf-3b60-407d-a1f4-a6660b7cb22f-openshift-state-metrics-tls podName:460b36bf-3b60-407d-a1f4-a6660b7cb22f nodeName:}" failed. No retries permitted until 2026-04-24 16:41:45.443338854 +0000 UTC m=+163.708054234 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/460b36bf-3b60-407d-a1f4-a6660b7cb22f-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-pzh2f" (UID: "460b36bf-3b60-407d-a1f4-a6660b7cb22f") : secret "openshift-state-metrics-tls" not found Apr 24 16:41:44.943549 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.943390 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-node-exporter-tls\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:44.943549 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.943442 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ctrz\" (UniqueName: \"kubernetes.io/projected/460b36bf-3b60-407d-a1f4-a6660b7cb22f-kube-api-access-9ctrz\") pod \"openshift-state-metrics-9d44df66c-pzh2f\" (UID: \"460b36bf-3b60-407d-a1f4-a6660b7cb22f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pzh2f" Apr 24 16:41:44.943549 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.943493 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-node-exporter-accelerators-collector-config\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:44.943549 ip-10-0-142-182 
kubenswrapper[2573]: I0424 16:41:44.943529 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:44.943833 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.943563 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-root\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:44.943833 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.943586 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-956z8\" (UniqueName: \"kubernetes.io/projected/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-kube-api-access-956z8\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:44.943833 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.943611 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-sys\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:44.944592 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.944565 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/460b36bf-3b60-407d-a1f4-a6660b7cb22f-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-pzh2f\" (UID: 
\"460b36bf-3b60-407d-a1f4-a6660b7cb22f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pzh2f" Apr 24 16:41:44.946558 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.946536 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/460b36bf-3b60-407d-a1f4-a6660b7cb22f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-pzh2f\" (UID: \"460b36bf-3b60-407d-a1f4-a6660b7cb22f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pzh2f" Apr 24 16:41:44.952945 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:44.952917 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ctrz\" (UniqueName: \"kubernetes.io/projected/460b36bf-3b60-407d-a1f4-a6660b7cb22f-kube-api-access-9ctrz\") pod \"openshift-state-metrics-9d44df66c-pzh2f\" (UID: \"460b36bf-3b60-407d-a1f4-a6660b7cb22f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pzh2f" Apr 24 16:41:45.044359 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:45.044296 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-node-exporter-accelerators-collector-config\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:45.044761 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:45.044376 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:45.044761 ip-10-0-142-182 
kubenswrapper[2573]: I0424 16:41:45.044411 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-root\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:45.044761 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:45.044438 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-956z8\" (UniqueName: \"kubernetes.io/projected/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-kube-api-access-956z8\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:45.044761 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:45.044464 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-sys\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:45.044761 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:45.044520 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-node-exporter-wtmp\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:45.044761 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:45.044557 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-metrics-client-ca\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:45.044761 ip-10-0-142-182 
kubenswrapper[2573]: I0424 16:41:45.044578 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-node-exporter-textfile\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:45.044761 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:45.044595 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-node-exporter-tls\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:45.045137 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:45.044971 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-root\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:45.045137 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:45.045003 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-node-exporter-wtmp\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:45.045137 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:45.045030 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-sys\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:45.045774 ip-10-0-142-182 kubenswrapper[2573]: I0424 
16:41:45.045622 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-node-exporter-accelerators-collector-config\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:45.045987 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:45.045951 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-metrics-client-ca\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:45.046655 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:45.046629 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-node-exporter-textfile\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:45.047553 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:45.047521 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-node-exporter-tls\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:45.047706 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:45.047686 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " 
pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:45.053829 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:45.053805 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-956z8\" (UniqueName: \"kubernetes.io/projected/cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a-kube-api-access-956z8\") pod \"node-exporter-65ts2\" (UID: \"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a\") " pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:45.122791 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:45.122753 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-65ts2" Apr 24 16:41:45.445321 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:41:45.445272 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb5b0f1f_7a2b_4ce1_ad76_bcc4f3aa2a4a.slice/crio-5cb43a008cb1bdcb436efdfe0b45f371b1800e1c981df20d279b95c37173e963 WatchSource:0}: Error finding container 5cb43a008cb1bdcb436efdfe0b45f371b1800e1c981df20d279b95c37173e963: Status 404 returned error can't find the container with id 5cb43a008cb1bdcb436efdfe0b45f371b1800e1c981df20d279b95c37173e963 Apr 24 16:41:45.447178 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:45.447151 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/460b36bf-3b60-407d-a1f4-a6660b7cb22f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-pzh2f\" (UID: \"460b36bf-3b60-407d-a1f4-a6660b7cb22f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pzh2f" Apr 24 16:41:45.449784 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:45.449760 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/460b36bf-3b60-407d-a1f4-a6660b7cb22f-openshift-state-metrics-tls\") pod 
\"openshift-state-metrics-9d44df66c-pzh2f\" (UID: \"460b36bf-3b60-407d-a1f4-a6660b7cb22f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pzh2f" Apr 24 16:41:45.662238 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:45.662204 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pzh2f" Apr 24 16:41:45.663771 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:45.663739 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-65ts2" event={"ID":"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a","Type":"ContainerStarted","Data":"5cb43a008cb1bdcb436efdfe0b45f371b1800e1c981df20d279b95c37173e963"} Apr 24 16:41:45.665196 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:45.665165 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f6b8n" event={"ID":"71bfbde9-7779-4c60-8a8a-0b238f76e255","Type":"ContainerStarted","Data":"e7c22423382e4300f8f8914f9a7efcbdd93cd2903c26f8d444ddb5abef8033ad"} Apr 24 16:41:45.783647 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:45.783614 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-pzh2f"] Apr 24 16:41:45.787459 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:41:45.787426 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod460b36bf_3b60_407d_a1f4_a6660b7cb22f.slice/crio-6eb8f178489a4e3cbd9e0e9f081c96b8d475fa33cd34ad774bf0d5aa899902ce WatchSource:0}: Error finding container 6eb8f178489a4e3cbd9e0e9f081c96b8d475fa33cd34ad774bf0d5aa899902ce: Status 404 returned error can't find the container with id 6eb8f178489a4e3cbd9e0e9f081c96b8d475fa33cd34ad774bf0d5aa899902ce Apr 24 16:41:46.669803 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:46.669764 2573 generic.go:358] "Generic (PLEG): container finished" podID="cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a" 
containerID="06b587409034312f2c97d34e9a7ca7e82e3bcaa941d3089c6c4b38394244dc84" exitCode=0 Apr 24 16:41:46.670269 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:46.669866 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-65ts2" event={"ID":"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a","Type":"ContainerDied","Data":"06b587409034312f2c97d34e9a7ca7e82e3bcaa941d3089c6c4b38394244dc84"} Apr 24 16:41:46.671666 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:46.671635 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f6b8n" event={"ID":"71bfbde9-7779-4c60-8a8a-0b238f76e255","Type":"ContainerStarted","Data":"89fa97d5a6c4b93b76984e4dfca7882c3efa5c77f8b37abc728238719648ef90"} Apr 24 16:41:46.671790 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:46.671765 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-f6b8n" Apr 24 16:41:46.673463 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:46.673438 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pzh2f" event={"ID":"460b36bf-3b60-407d-a1f4-a6660b7cb22f","Type":"ContainerStarted","Data":"97b2599719b87f4c343053a9c60bf23d21e28e147b87a19784adec17a17cad8f"} Apr 24 16:41:46.673590 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:46.673469 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pzh2f" event={"ID":"460b36bf-3b60-407d-a1f4-a6660b7cb22f","Type":"ContainerStarted","Data":"586a8779bc78946921135058a8d412e1577fd78d93862d5fbfb7f1322472717a"} Apr 24 16:41:46.673590 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:46.673484 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pzh2f" 
event={"ID":"460b36bf-3b60-407d-a1f4-a6660b7cb22f","Type":"ContainerStarted","Data":"6eb8f178489a4e3cbd9e0e9f081c96b8d475fa33cd34ad774bf0d5aa899902ce"} Apr 24 16:41:46.707912 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:46.707826 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-f6b8n" podStartSLOduration=129.529926776 podStartE2EDuration="2m10.707807266s" podCreationTimestamp="2026-04-24 16:39:36 +0000 UTC" firstStartedPulling="2026-04-24 16:41:44.28487388 +0000 UTC m=+162.549589269" lastFinishedPulling="2026-04-24 16:41:45.462754379 +0000 UTC m=+163.727469759" observedRunningTime="2026-04-24 16:41:46.707728574 +0000 UTC m=+164.972443978" watchObservedRunningTime="2026-04-24 16:41:46.707807266 +0000 UTC m=+164.972522658" Apr 24 16:41:47.677737 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:47.677692 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-65ts2" event={"ID":"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a","Type":"ContainerStarted","Data":"4b3fc13e9b1de968c18b68def56244bab0bf01d8e4fa89c359878d98fd156b65"} Apr 24 16:41:47.678136 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:47.677742 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-65ts2" event={"ID":"cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a","Type":"ContainerStarted","Data":"166a4faaf80f4b37afa428b914836d192b4beb6bc95352c9a33f1b752b5724c8"} Apr 24 16:41:47.679450 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:47.679424 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pzh2f" event={"ID":"460b36bf-3b60-407d-a1f4-a6660b7cb22f","Type":"ContainerStarted","Data":"ded24633505481b4c49313f37131c32ed5ee9ebec0b84218fc944ec6d988db2c"} Apr 24 16:41:47.701620 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:47.701579 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/node-exporter-65ts2" podStartSLOduration=3.044202695 podStartE2EDuration="3.701566632s" podCreationTimestamp="2026-04-24 16:41:44 +0000 UTC" firstStartedPulling="2026-04-24 16:41:45.447096246 +0000 UTC m=+163.711811622" lastFinishedPulling="2026-04-24 16:41:46.104460173 +0000 UTC m=+164.369175559" observedRunningTime="2026-04-24 16:41:47.700592904 +0000 UTC m=+165.965308302" watchObservedRunningTime="2026-04-24 16:41:47.701566632 +0000 UTC m=+165.966282030" Apr 24 16:41:47.723727 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:47.723681 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pzh2f" podStartSLOduration=2.837230293 podStartE2EDuration="3.723667686s" podCreationTimestamp="2026-04-24 16:41:44 +0000 UTC" firstStartedPulling="2026-04-24 16:41:45.918497771 +0000 UTC m=+164.183213147" lastFinishedPulling="2026-04-24 16:41:46.804935121 +0000 UTC m=+165.069650540" observedRunningTime="2026-04-24 16:41:47.722773416 +0000 UTC m=+165.987488811" watchObservedRunningTime="2026-04-24 16:41:47.723667686 +0000 UTC m=+165.988383086" Apr 24 16:41:49.285707 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.285668 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"] Apr 24 16:41:49.287631 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.287606 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5" Apr 24 16:41:49.291953 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.291928 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-2v7t4\"" Apr 24 16:41:49.292068 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.291929 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 24 16:41:49.292068 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.291929 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 24 16:41:49.292068 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.291929 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 24 16:41:49.292238 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.292225 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 16:41:49.292297 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.292260 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-69u87qf22poon\"" Apr 24 16:41:49.297839 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.297816 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"] Apr 24 16:41:49.383845 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.383806 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/334c37df-f60e-4d19-85bd-0aedb04d278b-secret-metrics-server-client-certs\") pod \"metrics-server-5fd685ffdd-zrbm5\" (UID: 
\"334c37df-f60e-4d19-85bd-0aedb04d278b\") " pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5" Apr 24 16:41:49.384019 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.383853 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334c37df-f60e-4d19-85bd-0aedb04d278b-client-ca-bundle\") pod \"metrics-server-5fd685ffdd-zrbm5\" (UID: \"334c37df-f60e-4d19-85bd-0aedb04d278b\") " pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5" Apr 24 16:41:49.384019 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.383880 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/334c37df-f60e-4d19-85bd-0aedb04d278b-secret-metrics-server-tls\") pod \"metrics-server-5fd685ffdd-zrbm5\" (UID: \"334c37df-f60e-4d19-85bd-0aedb04d278b\") " pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5" Apr 24 16:41:49.384019 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.383912 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/334c37df-f60e-4d19-85bd-0aedb04d278b-metrics-server-audit-profiles\") pod \"metrics-server-5fd685ffdd-zrbm5\" (UID: \"334c37df-f60e-4d19-85bd-0aedb04d278b\") " pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5" Apr 24 16:41:49.384019 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.383939 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/334c37df-f60e-4d19-85bd-0aedb04d278b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fd685ffdd-zrbm5\" (UID: \"334c37df-f60e-4d19-85bd-0aedb04d278b\") " pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5" Apr 24 16:41:49.384019 
ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.383987 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkf6l\" (UniqueName: \"kubernetes.io/projected/334c37df-f60e-4d19-85bd-0aedb04d278b-kube-api-access-lkf6l\") pod \"metrics-server-5fd685ffdd-zrbm5\" (UID: \"334c37df-f60e-4d19-85bd-0aedb04d278b\") " pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"
Apr 24 16:41:49.384175 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.384053 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/334c37df-f60e-4d19-85bd-0aedb04d278b-audit-log\") pod \"metrics-server-5fd685ffdd-zrbm5\" (UID: \"334c37df-f60e-4d19-85bd-0aedb04d278b\") " pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"
Apr 24 16:41:49.484569 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.484533 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334c37df-f60e-4d19-85bd-0aedb04d278b-client-ca-bundle\") pod \"metrics-server-5fd685ffdd-zrbm5\" (UID: \"334c37df-f60e-4d19-85bd-0aedb04d278b\") " pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"
Apr 24 16:41:49.484728 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.484578 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/334c37df-f60e-4d19-85bd-0aedb04d278b-secret-metrics-server-tls\") pod \"metrics-server-5fd685ffdd-zrbm5\" (UID: \"334c37df-f60e-4d19-85bd-0aedb04d278b\") " pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"
Apr 24 16:41:49.484728 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.484626 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/334c37df-f60e-4d19-85bd-0aedb04d278b-metrics-server-audit-profiles\") pod \"metrics-server-5fd685ffdd-zrbm5\" (UID: \"334c37df-f60e-4d19-85bd-0aedb04d278b\") " pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"
Apr 24 16:41:49.484728 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.484666 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/334c37df-f60e-4d19-85bd-0aedb04d278b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fd685ffdd-zrbm5\" (UID: \"334c37df-f60e-4d19-85bd-0aedb04d278b\") " pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"
Apr 24 16:41:49.484728 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.484698 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lkf6l\" (UniqueName: \"kubernetes.io/projected/334c37df-f60e-4d19-85bd-0aedb04d278b-kube-api-access-lkf6l\") pod \"metrics-server-5fd685ffdd-zrbm5\" (UID: \"334c37df-f60e-4d19-85bd-0aedb04d278b\") " pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"
Apr 24 16:41:49.485007 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.484746 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/334c37df-f60e-4d19-85bd-0aedb04d278b-audit-log\") pod \"metrics-server-5fd685ffdd-zrbm5\" (UID: \"334c37df-f60e-4d19-85bd-0aedb04d278b\") " pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"
Apr 24 16:41:49.485007 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.484779 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/334c37df-f60e-4d19-85bd-0aedb04d278b-secret-metrics-server-client-certs\") pod \"metrics-server-5fd685ffdd-zrbm5\" (UID: \"334c37df-f60e-4d19-85bd-0aedb04d278b\") " pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"
Apr 24 16:41:49.485367 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.485337 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/334c37df-f60e-4d19-85bd-0aedb04d278b-audit-log\") pod \"metrics-server-5fd685ffdd-zrbm5\" (UID: \"334c37df-f60e-4d19-85bd-0aedb04d278b\") " pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"
Apr 24 16:41:49.486265 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.486241 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/334c37df-f60e-4d19-85bd-0aedb04d278b-metrics-server-audit-profiles\") pod \"metrics-server-5fd685ffdd-zrbm5\" (UID: \"334c37df-f60e-4d19-85bd-0aedb04d278b\") " pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"
Apr 24 16:41:49.486265 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.486249 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/334c37df-f60e-4d19-85bd-0aedb04d278b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fd685ffdd-zrbm5\" (UID: \"334c37df-f60e-4d19-85bd-0aedb04d278b\") " pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"
Apr 24 16:41:49.487539 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.487496 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/334c37df-f60e-4d19-85bd-0aedb04d278b-secret-metrics-server-tls\") pod \"metrics-server-5fd685ffdd-zrbm5\" (UID: \"334c37df-f60e-4d19-85bd-0aedb04d278b\") " pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"
Apr 24 16:41:49.487645 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.487620 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/334c37df-f60e-4d19-85bd-0aedb04d278b-secret-metrics-server-client-certs\") pod \"metrics-server-5fd685ffdd-zrbm5\" (UID: \"334c37df-f60e-4d19-85bd-0aedb04d278b\") " pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"
Apr 24 16:41:49.487910 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.487890 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/334c37df-f60e-4d19-85bd-0aedb04d278b-client-ca-bundle\") pod \"metrics-server-5fd685ffdd-zrbm5\" (UID: \"334c37df-f60e-4d19-85bd-0aedb04d278b\") " pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"
Apr 24 16:41:49.493848 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.493825 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkf6l\" (UniqueName: \"kubernetes.io/projected/334c37df-f60e-4d19-85bd-0aedb04d278b-kube-api-access-lkf6l\") pod \"metrics-server-5fd685ffdd-zrbm5\" (UID: \"334c37df-f60e-4d19-85bd-0aedb04d278b\") " pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"
Apr 24 16:41:49.597188 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.597088 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"
Apr 24 16:41:49.734360 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:49.734328 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"]
Apr 24 16:41:49.737520 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:41:49.737491 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod334c37df_f60e_4d19_85bd_0aedb04d278b.slice/crio-c8bd7e968c4c5f9b386fbc33c96225c1885afb1d3a2a79a12499753893c9c0c7 WatchSource:0}: Error finding container c8bd7e968c4c5f9b386fbc33c96225c1885afb1d3a2a79a12499753893c9c0c7: Status 404 returned error can't find the container with id c8bd7e968c4c5f9b386fbc33c96225c1885afb1d3a2a79a12499753893c9c0c7
Apr 24 16:41:50.692399 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:50.692360 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5" event={"ID":"334c37df-f60e-4d19-85bd-0aedb04d278b","Type":"ContainerStarted","Data":"c8bd7e968c4c5f9b386fbc33c96225c1885afb1d3a2a79a12499753893c9c0c7"}
Apr 24 16:41:51.696445 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:51.696407 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5" event={"ID":"334c37df-f60e-4d19-85bd-0aedb04d278b","Type":"ContainerStarted","Data":"8b92ff3d11c488c7c57649f8810082048c939196f985ca6af281e0e6b75905f0"}
Apr 24 16:41:51.717160 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:51.717119 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5" podStartSLOduration=1.517367103 podStartE2EDuration="2.717107054s" podCreationTimestamp="2026-04-24 16:41:49 +0000 UTC" firstStartedPulling="2026-04-24 16:41:49.739721366 +0000 UTC m=+168.004436742" lastFinishedPulling="2026-04-24 16:41:50.939461316 +0000 UTC m=+169.204176693" observedRunningTime="2026-04-24 16:41:51.716965434 +0000 UTC m=+169.981680844" watchObservedRunningTime="2026-04-24 16:41:51.717107054 +0000 UTC m=+169.981822451"
Apr 24 16:41:52.213228 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.213196 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-74mjh"
Apr 24 16:41:52.570939 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.570898 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-68cc75cb57-5gmlf"]
Apr 24 16:41:52.573129 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.573106 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:41:52.577720 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.577695 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 24 16:41:52.577949 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.577931 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 16:41:52.578571 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.578551 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 24 16:41:52.578769 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.578751 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 16:41:52.579275 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.579223 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 24 16:41:52.579372 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.579357 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 16:41:52.579617 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.579594 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-4q5zg\""
Apr 24 16:41:52.579725 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.579638 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 24 16:41:52.584222 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.584202 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 24 16:41:52.585201 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.585183 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68cc75cb57-5gmlf"]
Apr 24 16:41:52.711825 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.711798 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-trusted-ca-bundle\") pod \"console-68cc75cb57-5gmlf\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") " pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:41:52.712204 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.711830 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-console-config\") pod \"console-68cc75cb57-5gmlf\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") " pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:41:52.712204 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.711860 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-oauth-serving-cert\") pod \"console-68cc75cb57-5gmlf\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") " pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:41:52.712204 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.711933 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8pwg\" (UniqueName: \"kubernetes.io/projected/f441b66d-3af5-4385-993e-31d7208259f0-kube-api-access-d8pwg\") pod \"console-68cc75cb57-5gmlf\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") " pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:41:52.712204 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.711966 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f441b66d-3af5-4385-993e-31d7208259f0-console-serving-cert\") pod \"console-68cc75cb57-5gmlf\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") " pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:41:52.712204 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.711989 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-service-ca\") pod \"console-68cc75cb57-5gmlf\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") " pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:41:52.712204 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.712070 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f441b66d-3af5-4385-993e-31d7208259f0-console-oauth-config\") pod \"console-68cc75cb57-5gmlf\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") " pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:41:52.812669 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.812638 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-trusted-ca-bundle\") pod \"console-68cc75cb57-5gmlf\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") " pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:41:52.812669 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.812671 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-console-config\") pod \"console-68cc75cb57-5gmlf\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") " pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:41:52.812843 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.812692 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-oauth-serving-cert\") pod \"console-68cc75cb57-5gmlf\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") " pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:41:52.812843 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.812777 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d8pwg\" (UniqueName: \"kubernetes.io/projected/f441b66d-3af5-4385-993e-31d7208259f0-kube-api-access-d8pwg\") pod \"console-68cc75cb57-5gmlf\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") " pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:41:52.812843 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.812808 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f441b66d-3af5-4385-993e-31d7208259f0-console-serving-cert\") pod \"console-68cc75cb57-5gmlf\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") " pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:41:52.812843 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.812835 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-service-ca\") pod \"console-68cc75cb57-5gmlf\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") " pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:41:52.813014 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.812937 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f441b66d-3af5-4385-993e-31d7208259f0-console-oauth-config\") pod \"console-68cc75cb57-5gmlf\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") " pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:41:52.813555 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.813526 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-oauth-serving-cert\") pod \"console-68cc75cb57-5gmlf\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") " pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:41:52.813657 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.813567 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-console-config\") pod \"console-68cc75cb57-5gmlf\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") " pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:41:52.813709 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.813657 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-service-ca\") pod \"console-68cc75cb57-5gmlf\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") " pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:41:52.813745 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.813708 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-trusted-ca-bundle\") pod \"console-68cc75cb57-5gmlf\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") " pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:41:52.815592 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.815568 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f441b66d-3af5-4385-993e-31d7208259f0-console-oauth-config\") pod \"console-68cc75cb57-5gmlf\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") " pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:41:52.815705 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.815688 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f441b66d-3af5-4385-993e-31d7208259f0-console-serving-cert\") pod \"console-68cc75cb57-5gmlf\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") " pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:41:52.820757 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.820735 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8pwg\" (UniqueName: \"kubernetes.io/projected/f441b66d-3af5-4385-993e-31d7208259f0-kube-api-access-d8pwg\") pod \"console-68cc75cb57-5gmlf\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") " pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:41:52.881674 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:52.881614 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:41:53.004961 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:53.004935 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68cc75cb57-5gmlf"]
Apr 24 16:41:53.007821 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:41:53.007793 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf441b66d_3af5_4385_993e_31d7208259f0.slice/crio-7a9bb0ac84a289ec29c1f88f3578b9ea9d045a2aa961a3b9871d6413abaad0c4 WatchSource:0}: Error finding container 7a9bb0ac84a289ec29c1f88f3578b9ea9d045a2aa961a3b9871d6413abaad0c4: Status 404 returned error can't find the container with id 7a9bb0ac84a289ec29c1f88f3578b9ea9d045a2aa961a3b9871d6413abaad0c4
Apr 24 16:41:53.703551 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:53.703500 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68cc75cb57-5gmlf" event={"ID":"f441b66d-3af5-4385-993e-31d7208259f0","Type":"ContainerStarted","Data":"7a9bb0ac84a289ec29c1f88f3578b9ea9d045a2aa961a3b9871d6413abaad0c4"}
Apr 24 16:41:54.212746 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:54.212709 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6b27c"
Apr 24 16:41:54.215438 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:54.215406 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lhhc7\""
Apr 24 16:41:54.223211 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:54.223179 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6b27c"
Apr 24 16:41:54.357540 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:54.357511 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6b27c"]
Apr 24 16:41:54.362362 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:41:54.362330 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ae78d18_eee9_4ff6_b5b1_81a6bd62493c.slice/crio-9792f71fbf3702fbe4706ac604e62c5aaeb1a47d8024deab0256277fc3eeb013 WatchSource:0}: Error finding container 9792f71fbf3702fbe4706ac604e62c5aaeb1a47d8024deab0256277fc3eeb013: Status 404 returned error can't find the container with id 9792f71fbf3702fbe4706ac604e62c5aaeb1a47d8024deab0256277fc3eeb013
Apr 24 16:41:54.707283 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:54.707246 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6b27c" event={"ID":"8ae78d18-eee9-4ff6-b5b1-81a6bd62493c","Type":"ContainerStarted","Data":"9792f71fbf3702fbe4706ac604e62c5aaeb1a47d8024deab0256277fc3eeb013"}
Apr 24 16:41:56.682565 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:56.682532 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-f6b8n"
Apr 24 16:41:56.714501 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:56.714448 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68cc75cb57-5gmlf" event={"ID":"f441b66d-3af5-4385-993e-31d7208259f0","Type":"ContainerStarted","Data":"aee3923054816cf25e94ef905c6145857180ef207885a86755a15117026bbf07"}
Apr 24 16:41:56.773094 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:56.773045 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68cc75cb57-5gmlf" podStartSLOduration=2.10500942 podStartE2EDuration="4.773025873s" podCreationTimestamp="2026-04-24 16:41:52 +0000 UTC" firstStartedPulling="2026-04-24 16:41:53.009778133 +0000 UTC m=+171.274493510" lastFinishedPulling="2026-04-24 16:41:55.677794587 +0000 UTC m=+173.942509963" observedRunningTime="2026-04-24 16:41:56.77176263 +0000 UTC m=+175.036478105" watchObservedRunningTime="2026-04-24 16:41:56.773025873 +0000 UTC m=+175.037741270"
Apr 24 16:41:57.718783 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:57.718745 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6b27c" event={"ID":"8ae78d18-eee9-4ff6-b5b1-81a6bd62493c","Type":"ContainerStarted","Data":"0414d1491b37bdedf4d06498cd47a34fd9cf26ee37af0a29bed2419518b5c27a"}
Apr 24 16:41:57.736561 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:41:57.736516 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6b27c" podStartSLOduration=139.162679318 podStartE2EDuration="2m21.736500532s" podCreationTimestamp="2026-04-24 16:39:36 +0000 UTC" firstStartedPulling="2026-04-24 16:41:54.364682824 +0000 UTC m=+172.629398202" lastFinishedPulling="2026-04-24 16:41:56.938504039 +0000 UTC m=+175.203219416" observedRunningTime="2026-04-24 16:41:57.735450591 +0000 UTC m=+176.000165989" watchObservedRunningTime="2026-04-24 16:41:57.736500532 +0000 UTC m=+176.001215929"
Apr 24 16:42:00.655736 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:42:00.655707 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-54d688d7d7-nlm52"
Apr 24 16:42:02.882096 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:42:02.882060 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:42:02.882096 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:42:02.882102 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:42:02.886876 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:42:02.886852 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:42:03.738712 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:42:03.738683 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:42:09.597424 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:42:09.597381 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"
Apr 24 16:42:09.597424 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:42:09.597429 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"
Apr 24 16:42:29.604389 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:42:29.604352 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"
Apr 24 16:42:29.609354 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:42:29.609324 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5fd685ffdd-zrbm5"
Apr 24 16:42:44.450687 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:42:44.450627 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl" podUID="1486bd60-1c32-4e9b-a771-ae9a78a6a370" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 16:42:54.451206 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:42:54.451165 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl" podUID="1486bd60-1c32-4e9b-a771-ae9a78a6a370" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 16:43:04.451180 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:04.451139 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl" podUID="1486bd60-1c32-4e9b-a771-ae9a78a6a370" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 16:43:04.451619 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:04.451221 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl"
Apr 24 16:43:04.451712 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:04.451683 2573 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"27e12c0be53812b91e0525138b914f95100922458886862194854fe367325c6e"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 24 16:43:04.451761 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:04.451746 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl" podUID="1486bd60-1c32-4e9b-a771-ae9a78a6a370" containerName="service-proxy" containerID="cri-o://27e12c0be53812b91e0525138b914f95100922458886862194854fe367325c6e" gracePeriod=30
Apr 24 16:43:04.902258 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:04.902222 2573 generic.go:358] "Generic (PLEG): container finished" podID="1486bd60-1c32-4e9b-a771-ae9a78a6a370" containerID="27e12c0be53812b91e0525138b914f95100922458886862194854fe367325c6e" exitCode=2
Apr 24 16:43:04.902445 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:04.902271 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl" event={"ID":"1486bd60-1c32-4e9b-a771-ae9a78a6a370","Type":"ContainerDied","Data":"27e12c0be53812b91e0525138b914f95100922458886862194854fe367325c6e"}
Apr 24 16:43:04.902445 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:04.902338 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86d5c5cb54-xhkzl" event={"ID":"1486bd60-1c32-4e9b-a771-ae9a78a6a370","Type":"ContainerStarted","Data":"c14eb061f4c5cfa1d0705d78fe830aeef83473bf4751c9c91d1f1523aedb748b"}
Apr 24 16:43:14.058146 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:14.058103 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs\") pod \"network-metrics-daemon-74mjh\" (UID: \"3052a162-5d36-4309-9bd0-bca01410b715\") " pod="openshift-multus/network-metrics-daemon-74mjh"
Apr 24 16:43:14.060543 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:14.060522 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3052a162-5d36-4309-9bd0-bca01410b715-metrics-certs\") pod \"network-metrics-daemon-74mjh\" (UID: \"3052a162-5d36-4309-9bd0-bca01410b715\") " pod="openshift-multus/network-metrics-daemon-74mjh"
Apr 24 16:43:14.115936 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:14.115908 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zln8m\""
Apr 24 16:43:14.123728 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:14.123707 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-74mjh"
Apr 24 16:43:14.243725 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:14.243699 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-74mjh"]
Apr 24 16:43:14.246719 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:43:14.246693 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3052a162_5d36_4309_9bd0_bca01410b715.slice/crio-f553971b710d501cf3b363b617ed5d43048d8401d49ddab25bc7cc6387633742 WatchSource:0}: Error finding container f553971b710d501cf3b363b617ed5d43048d8401d49ddab25bc7cc6387633742: Status 404 returned error can't find the container with id f553971b710d501cf3b363b617ed5d43048d8401d49ddab25bc7cc6387633742
Apr 24 16:43:14.929515 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:14.929475 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-74mjh" event={"ID":"3052a162-5d36-4309-9bd0-bca01410b715","Type":"ContainerStarted","Data":"f553971b710d501cf3b363b617ed5d43048d8401d49ddab25bc7cc6387633742"}
Apr 24 16:43:15.933260 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:15.933227 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-74mjh" event={"ID":"3052a162-5d36-4309-9bd0-bca01410b715","Type":"ContainerStarted","Data":"b1aac228b63dbff0b1a161dbed6c06c487cd26895bda60d75a81c11be6c34a17"}
Apr 24 16:43:15.933260 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:15.933262 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-74mjh" event={"ID":"3052a162-5d36-4309-9bd0-bca01410b715","Type":"ContainerStarted","Data":"c253d67d1f7e42451df79af8e4d06e3b95ab485d008a06fa178b962e4784395c"}
Apr 24 16:43:15.952047 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:15.951985 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-74mjh" podStartSLOduration=252.909427674 podStartE2EDuration="4m13.951970516s" podCreationTimestamp="2026-04-24 16:39:02 +0000 UTC" firstStartedPulling="2026-04-24 16:43:14.248421623 +0000 UTC m=+252.513137001" lastFinishedPulling="2026-04-24 16:43:15.290964467 +0000 UTC m=+253.555679843" observedRunningTime="2026-04-24 16:43:15.950881463 +0000 UTC m=+254.215596860" watchObservedRunningTime="2026-04-24 16:43:15.951970516 +0000 UTC m=+254.216685913"
Apr 24 16:43:24.262524 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:24.262446 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68cc75cb57-5gmlf"]
Apr 24 16:43:49.281494 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:49.281429 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-68cc75cb57-5gmlf" podUID="f441b66d-3af5-4385-993e-31d7208259f0" containerName="console" containerID="cri-o://aee3923054816cf25e94ef905c6145857180ef207885a86755a15117026bbf07" gracePeriod=15
Apr 24 16:43:49.521224 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:49.521202 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68cc75cb57-5gmlf_f441b66d-3af5-4385-993e-31d7208259f0/console/0.log"
Apr 24 16:43:49.521373 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:49.521274 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68cc75cb57-5gmlf"
Apr 24 16:43:49.615192 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:49.615112 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-trusted-ca-bundle\") pod \"f441b66d-3af5-4385-993e-31d7208259f0\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") "
Apr 24 16:43:49.615192 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:49.615158 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-service-ca\") pod \"f441b66d-3af5-4385-993e-31d7208259f0\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") "
Apr 24 16:43:49.615192 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:49.615179 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-console-config\") pod \"f441b66d-3af5-4385-993e-31d7208259f0\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") "
Apr 24 16:43:49.615492 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:49.615195 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8pwg\" (UniqueName: \"kubernetes.io/projected/f441b66d-3af5-4385-993e-31d7208259f0-kube-api-access-d8pwg\") pod \"f441b66d-3af5-4385-993e-31d7208259f0\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") "
Apr 24 16:43:49.615492 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:49.615367 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-oauth-serving-cert\") pod \"f441b66d-3af5-4385-993e-31d7208259f0\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") "
Apr 24 16:43:49.615492 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:49.615429 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f441b66d-3af5-4385-993e-31d7208259f0-console-serving-cert\") pod \"f441b66d-3af5-4385-993e-31d7208259f0\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") "
Apr 24 16:43:49.615492 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:49.615476 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f441b66d-3af5-4385-993e-31d7208259f0-console-oauth-config\") pod \"f441b66d-3af5-4385-993e-31d7208259f0\" (UID: \"f441b66d-3af5-4385-993e-31d7208259f0\") "
Apr 24 16:43:49.615689 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:49.615631 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f441b66d-3af5-4385-993e-31d7208259f0" (UID: "f441b66d-3af5-4385-993e-31d7208259f0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:43:49.615741 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:49.615660 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-service-ca" (OuterVolumeSpecName: "service-ca") pod "f441b66d-3af5-4385-993e-31d7208259f0" (UID: "f441b66d-3af5-4385-993e-31d7208259f0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:43:49.615741 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:49.615726 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f441b66d-3af5-4385-993e-31d7208259f0" (UID: "f441b66d-3af5-4385-993e-31d7208259f0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:43:49.615825 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:49.615731 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-console-config" (OuterVolumeSpecName: "console-config") pod "f441b66d-3af5-4385-993e-31d7208259f0" (UID: "f441b66d-3af5-4385-993e-31d7208259f0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:43:49.615825 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:49.615748 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-trusted-ca-bundle\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 16:43:49.617762 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:49.617724 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f441b66d-3af5-4385-993e-31d7208259f0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f441b66d-3af5-4385-993e-31d7208259f0" (UID: "f441b66d-3af5-4385-993e-31d7208259f0"). InnerVolumeSpecName "console-serving-cert".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:43:49.617762 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:49.617738 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f441b66d-3af5-4385-993e-31d7208259f0-kube-api-access-d8pwg" (OuterVolumeSpecName: "kube-api-access-d8pwg") pod "f441b66d-3af5-4385-993e-31d7208259f0" (UID: "f441b66d-3af5-4385-993e-31d7208259f0"). InnerVolumeSpecName "kube-api-access-d8pwg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:43:49.617902 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:49.617806 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f441b66d-3af5-4385-993e-31d7208259f0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f441b66d-3af5-4385-993e-31d7208259f0" (UID: "f441b66d-3af5-4385-993e-31d7208259f0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:43:49.717003 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:49.716964 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f441b66d-3af5-4385-993e-31d7208259f0-console-serving-cert\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:43:49.717003 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:49.716992 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f441b66d-3af5-4385-993e-31d7208259f0-console-oauth-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:43:49.717003 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:49.717003 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-service-ca\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:43:49.717003 ip-10-0-142-182 
kubenswrapper[2573]: I0424 16:43:49.717013 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-console-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:43:49.717331 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:49.717022 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d8pwg\" (UniqueName: \"kubernetes.io/projected/f441b66d-3af5-4385-993e-31d7208259f0-kube-api-access-d8pwg\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:43:49.717331 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:49.717031 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f441b66d-3af5-4385-993e-31d7208259f0-oauth-serving-cert\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:43:50.023499 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:50.023473 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68cc75cb57-5gmlf_f441b66d-3af5-4385-993e-31d7208259f0/console/0.log" Apr 24 16:43:50.023686 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:50.023515 2573 generic.go:358] "Generic (PLEG): container finished" podID="f441b66d-3af5-4385-993e-31d7208259f0" containerID="aee3923054816cf25e94ef905c6145857180ef207885a86755a15117026bbf07" exitCode=2 Apr 24 16:43:50.023686 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:50.023543 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68cc75cb57-5gmlf" event={"ID":"f441b66d-3af5-4385-993e-31d7208259f0","Type":"ContainerDied","Data":"aee3923054816cf25e94ef905c6145857180ef207885a86755a15117026bbf07"} Apr 24 16:43:50.023686 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:50.023583 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68cc75cb57-5gmlf" 
event={"ID":"f441b66d-3af5-4385-993e-31d7208259f0","Type":"ContainerDied","Data":"7a9bb0ac84a289ec29c1f88f3578b9ea9d045a2aa961a3b9871d6413abaad0c4"} Apr 24 16:43:50.023686 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:50.023593 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68cc75cb57-5gmlf" Apr 24 16:43:50.023887 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:50.023598 2573 scope.go:117] "RemoveContainer" containerID="aee3923054816cf25e94ef905c6145857180ef207885a86755a15117026bbf07" Apr 24 16:43:50.032464 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:50.032448 2573 scope.go:117] "RemoveContainer" containerID="aee3923054816cf25e94ef905c6145857180ef207885a86755a15117026bbf07" Apr 24 16:43:50.032720 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:43:50.032701 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aee3923054816cf25e94ef905c6145857180ef207885a86755a15117026bbf07\": container with ID starting with aee3923054816cf25e94ef905c6145857180ef207885a86755a15117026bbf07 not found: ID does not exist" containerID="aee3923054816cf25e94ef905c6145857180ef207885a86755a15117026bbf07" Apr 24 16:43:50.032767 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:50.032728 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aee3923054816cf25e94ef905c6145857180ef207885a86755a15117026bbf07"} err="failed to get container status \"aee3923054816cf25e94ef905c6145857180ef207885a86755a15117026bbf07\": rpc error: code = NotFound desc = could not find container \"aee3923054816cf25e94ef905c6145857180ef207885a86755a15117026bbf07\": container with ID starting with aee3923054816cf25e94ef905c6145857180ef207885a86755a15117026bbf07 not found: ID does not exist" Apr 24 16:43:50.050481 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:50.050452 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-68cc75cb57-5gmlf"] Apr 24 16:43:50.070777 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:50.070747 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-68cc75cb57-5gmlf"] Apr 24 16:43:50.215731 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:43:50.215697 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f441b66d-3af5-4385-993e-31d7208259f0" path="/var/lib/kubelet/pods/f441b66d-3af5-4385-993e-31d7208259f0/volumes" Apr 24 16:44:02.106393 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:44:02.106364 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log" Apr 24 16:44:02.113187 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:44:02.113141 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 16:44:02.114130 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:44:02.114112 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log" Apr 24 16:45:00.493747 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:45:00.493663 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-cmffg"] Apr 24 16:45:00.494162 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:45:00.493895 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f441b66d-3af5-4385-993e-31d7208259f0" containerName="console" Apr 24 16:45:00.494162 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:45:00.493906 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f441b66d-3af5-4385-993e-31d7208259f0" containerName="console" Apr 24 16:45:00.494162 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:45:00.493957 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f441b66d-3af5-4385-993e-31d7208259f0" containerName="console" Apr 24 
16:45:00.496661 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:45:00.496644 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cmffg" Apr 24 16:45:00.498761 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:45:00.498734 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 16:45:00.503750 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:45:00.503724 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cmffg"] Apr 24 16:45:00.539512 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:45:00.539480 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/030a15c9-43fa-4bf6-a710-a4f8b1f3c7de-kubelet-config\") pod \"global-pull-secret-syncer-cmffg\" (UID: \"030a15c9-43fa-4bf6-a710-a4f8b1f3c7de\") " pod="kube-system/global-pull-secret-syncer-cmffg" Apr 24 16:45:00.539512 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:45:00.539525 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/030a15c9-43fa-4bf6-a710-a4f8b1f3c7de-original-pull-secret\") pod \"global-pull-secret-syncer-cmffg\" (UID: \"030a15c9-43fa-4bf6-a710-a4f8b1f3c7de\") " pod="kube-system/global-pull-secret-syncer-cmffg" Apr 24 16:45:00.539733 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:45:00.539550 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/030a15c9-43fa-4bf6-a710-a4f8b1f3c7de-dbus\") pod \"global-pull-secret-syncer-cmffg\" (UID: \"030a15c9-43fa-4bf6-a710-a4f8b1f3c7de\") " pod="kube-system/global-pull-secret-syncer-cmffg" Apr 24 16:45:00.639976 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:45:00.639942 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/030a15c9-43fa-4bf6-a710-a4f8b1f3c7de-kubelet-config\") pod \"global-pull-secret-syncer-cmffg\" (UID: \"030a15c9-43fa-4bf6-a710-a4f8b1f3c7de\") " pod="kube-system/global-pull-secret-syncer-cmffg" Apr 24 16:45:00.640119 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:45:00.639988 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/030a15c9-43fa-4bf6-a710-a4f8b1f3c7de-original-pull-secret\") pod \"global-pull-secret-syncer-cmffg\" (UID: \"030a15c9-43fa-4bf6-a710-a4f8b1f3c7de\") " pod="kube-system/global-pull-secret-syncer-cmffg" Apr 24 16:45:00.640119 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:45:00.640008 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/030a15c9-43fa-4bf6-a710-a4f8b1f3c7de-dbus\") pod \"global-pull-secret-syncer-cmffg\" (UID: \"030a15c9-43fa-4bf6-a710-a4f8b1f3c7de\") " pod="kube-system/global-pull-secret-syncer-cmffg" Apr 24 16:45:00.640119 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:45:00.640070 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/030a15c9-43fa-4bf6-a710-a4f8b1f3c7de-kubelet-config\") pod \"global-pull-secret-syncer-cmffg\" (UID: \"030a15c9-43fa-4bf6-a710-a4f8b1f3c7de\") " pod="kube-system/global-pull-secret-syncer-cmffg" Apr 24 16:45:00.640273 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:45:00.640137 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/030a15c9-43fa-4bf6-a710-a4f8b1f3c7de-dbus\") pod \"global-pull-secret-syncer-cmffg\" (UID: \"030a15c9-43fa-4bf6-a710-a4f8b1f3c7de\") " pod="kube-system/global-pull-secret-syncer-cmffg" Apr 24 16:45:00.642461 ip-10-0-142-182 kubenswrapper[2573]: I0424 
16:45:00.642438 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/030a15c9-43fa-4bf6-a710-a4f8b1f3c7de-original-pull-secret\") pod \"global-pull-secret-syncer-cmffg\" (UID: \"030a15c9-43fa-4bf6-a710-a4f8b1f3c7de\") " pod="kube-system/global-pull-secret-syncer-cmffg" Apr 24 16:45:00.806896 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:45:00.806810 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cmffg" Apr 24 16:45:00.919739 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:45:00.919669 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cmffg"] Apr 24 16:45:00.926903 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:45:00.926880 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 16:45:01.214599 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:45:01.214512 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cmffg" event={"ID":"030a15c9-43fa-4bf6-a710-a4f8b1f3c7de","Type":"ContainerStarted","Data":"db98d13ad7571670b7a78ef5e8311442f9296a74fe6d379a8769a6a969fde392"} Apr 24 16:45:05.226552 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:45:05.226501 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cmffg" event={"ID":"030a15c9-43fa-4bf6-a710-a4f8b1f3c7de","Type":"ContainerStarted","Data":"c6159a58cb470b7597f89ee4cc58b5a16479d7e6d26a5a2af5b09321f6289fd4"} Apr 24 16:45:05.242912 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:45:05.242860 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-cmffg" podStartSLOduration=1.565501249 podStartE2EDuration="5.242847237s" podCreationTimestamp="2026-04-24 16:45:00 +0000 UTC" firstStartedPulling="2026-04-24 16:45:00.92700995 +0000 UTC 
m=+359.191725325" lastFinishedPulling="2026-04-24 16:45:04.604355933 +0000 UTC m=+362.869071313" observedRunningTime="2026-04-24 16:45:05.241600623 +0000 UTC m=+363.506316023" watchObservedRunningTime="2026-04-24 16:45:05.242847237 +0000 UTC m=+363.507562694" Apr 24 16:47:50.488121 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:47:50.487995 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-kw295"] Apr 24 16:47:50.490068 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:47:50.490044 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-kw295" Apr 24 16:47:50.492354 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:47:50.492330 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-z9xpl\"" Apr 24 16:47:50.492516 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:47:50.492494 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 16:47:50.493091 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:47:50.493073 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 16:47:50.493179 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:47:50.493076 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 16:47:50.501770 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:47:50.501744 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-kw295"] Apr 24 16:47:50.567707 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:47:50.567663 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgw7t\" (UniqueName: \"kubernetes.io/projected/086299da-19b1-4a79-9d64-909f24944518-kube-api-access-dgw7t\") pod \"seaweedfs-86cc847c5c-kw295\" (UID: 
\"086299da-19b1-4a79-9d64-909f24944518\") " pod="kserve/seaweedfs-86cc847c5c-kw295" Apr 24 16:47:50.567903 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:47:50.567795 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/086299da-19b1-4a79-9d64-909f24944518-data\") pod \"seaweedfs-86cc847c5c-kw295\" (UID: \"086299da-19b1-4a79-9d64-909f24944518\") " pod="kserve/seaweedfs-86cc847c5c-kw295" Apr 24 16:47:50.673517 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:47:50.673476 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgw7t\" (UniqueName: \"kubernetes.io/projected/086299da-19b1-4a79-9d64-909f24944518-kube-api-access-dgw7t\") pod \"seaweedfs-86cc847c5c-kw295\" (UID: \"086299da-19b1-4a79-9d64-909f24944518\") " pod="kserve/seaweedfs-86cc847c5c-kw295" Apr 24 16:47:50.673696 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:47:50.673565 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/086299da-19b1-4a79-9d64-909f24944518-data\") pod \"seaweedfs-86cc847c5c-kw295\" (UID: \"086299da-19b1-4a79-9d64-909f24944518\") " pod="kserve/seaweedfs-86cc847c5c-kw295" Apr 24 16:47:50.673927 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:47:50.673912 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/086299da-19b1-4a79-9d64-909f24944518-data\") pod \"seaweedfs-86cc847c5c-kw295\" (UID: \"086299da-19b1-4a79-9d64-909f24944518\") " pod="kserve/seaweedfs-86cc847c5c-kw295" Apr 24 16:47:50.685913 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:47:50.685885 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgw7t\" (UniqueName: \"kubernetes.io/projected/086299da-19b1-4a79-9d64-909f24944518-kube-api-access-dgw7t\") pod \"seaweedfs-86cc847c5c-kw295\" (UID: 
\"086299da-19b1-4a79-9d64-909f24944518\") " pod="kserve/seaweedfs-86cc847c5c-kw295" Apr 24 16:47:50.800102 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:47:50.800054 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-kw295" Apr 24 16:47:50.921836 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:47:50.921799 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-kw295"] Apr 24 16:47:50.926149 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:47:50.926120 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod086299da_19b1_4a79_9d64_909f24944518.slice/crio-72f4fcf4ab74a6c85e32fe147b1b0baad26b8c2d354eb4623b0d2585c3190575 WatchSource:0}: Error finding container 72f4fcf4ab74a6c85e32fe147b1b0baad26b8c2d354eb4623b0d2585c3190575: Status 404 returned error can't find the container with id 72f4fcf4ab74a6c85e32fe147b1b0baad26b8c2d354eb4623b0d2585c3190575 Apr 24 16:47:51.656813 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:47:51.656774 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-kw295" event={"ID":"086299da-19b1-4a79-9d64-909f24944518","Type":"ContainerStarted","Data":"72f4fcf4ab74a6c85e32fe147b1b0baad26b8c2d354eb4623b0d2585c3190575"} Apr 24 16:47:53.667280 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:47:53.667205 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-kw295" event={"ID":"086299da-19b1-4a79-9d64-909f24944518","Type":"ContainerStarted","Data":"9cbf71e4f3b3caca0c11c53565059778402f6155ebcc3e4a200c269dd78d53df"} Apr 24 16:47:53.667648 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:47:53.667339 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-kw295" Apr 24 16:47:53.683761 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:47:53.683712 2573 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-kw295" podStartSLOduration=1.251760811 podStartE2EDuration="3.683696009s" podCreationTimestamp="2026-04-24 16:47:50 +0000 UTC" firstStartedPulling="2026-04-24 16:47:50.927911078 +0000 UTC m=+529.192626460" lastFinishedPulling="2026-04-24 16:47:53.359846282 +0000 UTC m=+531.624561658" observedRunningTime="2026-04-24 16:47:53.682895522 +0000 UTC m=+531.947610920" watchObservedRunningTime="2026-04-24 16:47:53.683696009 +0000 UTC m=+531.948411406" Apr 24 16:47:59.672712 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:47:59.672675 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-kw295" Apr 24 16:48:24.587201 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.587157 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-66c67549db-42lpr"] Apr 24 16:48:24.589825 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.589806 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66c67549db-42lpr" Apr 24 16:48:24.592226 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.592202 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-4q5zg\"" Apr 24 16:48:24.593042 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.593023 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 16:48:24.593140 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.593120 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 16:48:24.593181 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.593154 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 16:48:24.593441 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.593425 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 16:48:24.593672 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.593655 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 16:48:24.593993 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.593977 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 16:48:24.594041 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.594026 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 16:48:24.598784 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.598762 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 16:48:24.604352 ip-10-0-142-182 
kubenswrapper[2573]: I0424 16:48:24.604327 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66c67549db-42lpr"] Apr 24 16:48:24.625148 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.625120 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ee21b4c-e55e-4722-921c-f8e15ca79c5b-service-ca\") pod \"console-66c67549db-42lpr\" (UID: \"0ee21b4c-e55e-4722-921c-f8e15ca79c5b\") " pod="openshift-console/console-66c67549db-42lpr" Apr 24 16:48:24.625336 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.625162 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ee21b4c-e55e-4722-921c-f8e15ca79c5b-console-serving-cert\") pod \"console-66c67549db-42lpr\" (UID: \"0ee21b4c-e55e-4722-921c-f8e15ca79c5b\") " pod="openshift-console/console-66c67549db-42lpr" Apr 24 16:48:24.625336 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.625187 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ee21b4c-e55e-4722-921c-f8e15ca79c5b-oauth-serving-cert\") pod \"console-66c67549db-42lpr\" (UID: \"0ee21b4c-e55e-4722-921c-f8e15ca79c5b\") " pod="openshift-console/console-66c67549db-42lpr" Apr 24 16:48:24.625336 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.625258 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ee21b4c-e55e-4722-921c-f8e15ca79c5b-trusted-ca-bundle\") pod \"console-66c67549db-42lpr\" (UID: \"0ee21b4c-e55e-4722-921c-f8e15ca79c5b\") " pod="openshift-console/console-66c67549db-42lpr" Apr 24 16:48:24.625336 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.625301 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qlmw\" (UniqueName: \"kubernetes.io/projected/0ee21b4c-e55e-4722-921c-f8e15ca79c5b-kube-api-access-5qlmw\") pod \"console-66c67549db-42lpr\" (UID: \"0ee21b4c-e55e-4722-921c-f8e15ca79c5b\") " pod="openshift-console/console-66c67549db-42lpr"
Apr 24 16:48:24.625490 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.625376 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ee21b4c-e55e-4722-921c-f8e15ca79c5b-console-config\") pod \"console-66c67549db-42lpr\" (UID: \"0ee21b4c-e55e-4722-921c-f8e15ca79c5b\") " pod="openshift-console/console-66c67549db-42lpr"
Apr 24 16:48:24.625490 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.625401 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ee21b4c-e55e-4722-921c-f8e15ca79c5b-console-oauth-config\") pod \"console-66c67549db-42lpr\" (UID: \"0ee21b4c-e55e-4722-921c-f8e15ca79c5b\") " pod="openshift-console/console-66c67549db-42lpr"
Apr 24 16:48:24.726287 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.726248 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ee21b4c-e55e-4722-921c-f8e15ca79c5b-service-ca\") pod \"console-66c67549db-42lpr\" (UID: \"0ee21b4c-e55e-4722-921c-f8e15ca79c5b\") " pod="openshift-console/console-66c67549db-42lpr"
Apr 24 16:48:24.726287 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.726303 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ee21b4c-e55e-4722-921c-f8e15ca79c5b-console-serving-cert\") pod \"console-66c67549db-42lpr\" (UID: \"0ee21b4c-e55e-4722-921c-f8e15ca79c5b\") " pod="openshift-console/console-66c67549db-42lpr"
Apr 24 16:48:24.726580 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.726360 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ee21b4c-e55e-4722-921c-f8e15ca79c5b-oauth-serving-cert\") pod \"console-66c67549db-42lpr\" (UID: \"0ee21b4c-e55e-4722-921c-f8e15ca79c5b\") " pod="openshift-console/console-66c67549db-42lpr"
Apr 24 16:48:24.726580 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.726379 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ee21b4c-e55e-4722-921c-f8e15ca79c5b-trusted-ca-bundle\") pod \"console-66c67549db-42lpr\" (UID: \"0ee21b4c-e55e-4722-921c-f8e15ca79c5b\") " pod="openshift-console/console-66c67549db-42lpr"
Apr 24 16:48:24.726580 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.726394 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5qlmw\" (UniqueName: \"kubernetes.io/projected/0ee21b4c-e55e-4722-921c-f8e15ca79c5b-kube-api-access-5qlmw\") pod \"console-66c67549db-42lpr\" (UID: \"0ee21b4c-e55e-4722-921c-f8e15ca79c5b\") " pod="openshift-console/console-66c67549db-42lpr"
Apr 24 16:48:24.726580 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.726426 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ee21b4c-e55e-4722-921c-f8e15ca79c5b-console-config\") pod \"console-66c67549db-42lpr\" (UID: \"0ee21b4c-e55e-4722-921c-f8e15ca79c5b\") " pod="openshift-console/console-66c67549db-42lpr"
Apr 24 16:48:24.726580 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.726555 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ee21b4c-e55e-4722-921c-f8e15ca79c5b-console-oauth-config\") pod \"console-66c67549db-42lpr\" (UID: \"0ee21b4c-e55e-4722-921c-f8e15ca79c5b\") " pod="openshift-console/console-66c67549db-42lpr"
Apr 24 16:48:24.727077 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.727046 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ee21b4c-e55e-4722-921c-f8e15ca79c5b-oauth-serving-cert\") pod \"console-66c67549db-42lpr\" (UID: \"0ee21b4c-e55e-4722-921c-f8e15ca79c5b\") " pod="openshift-console/console-66c67549db-42lpr"
Apr 24 16:48:24.727376 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.727045 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ee21b4c-e55e-4722-921c-f8e15ca79c5b-service-ca\") pod \"console-66c67549db-42lpr\" (UID: \"0ee21b4c-e55e-4722-921c-f8e15ca79c5b\") " pod="openshift-console/console-66c67549db-42lpr"
Apr 24 16:48:24.727376 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.727181 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ee21b4c-e55e-4722-921c-f8e15ca79c5b-console-config\") pod \"console-66c67549db-42lpr\" (UID: \"0ee21b4c-e55e-4722-921c-f8e15ca79c5b\") " pod="openshift-console/console-66c67549db-42lpr"
Apr 24 16:48:24.727500 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.727464 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ee21b4c-e55e-4722-921c-f8e15ca79c5b-trusted-ca-bundle\") pod \"console-66c67549db-42lpr\" (UID: \"0ee21b4c-e55e-4722-921c-f8e15ca79c5b\") " pod="openshift-console/console-66c67549db-42lpr"
Apr 24 16:48:24.729039 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.729010 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ee21b4c-e55e-4722-921c-f8e15ca79c5b-console-oauth-config\") pod \"console-66c67549db-42lpr\" (UID: \"0ee21b4c-e55e-4722-921c-f8e15ca79c5b\") " pod="openshift-console/console-66c67549db-42lpr"
Apr 24 16:48:24.729184 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.729163 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ee21b4c-e55e-4722-921c-f8e15ca79c5b-console-serving-cert\") pod \"console-66c67549db-42lpr\" (UID: \"0ee21b4c-e55e-4722-921c-f8e15ca79c5b\") " pod="openshift-console/console-66c67549db-42lpr"
Apr 24 16:48:24.749892 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.749863 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qlmw\" (UniqueName: \"kubernetes.io/projected/0ee21b4c-e55e-4722-921c-f8e15ca79c5b-kube-api-access-5qlmw\") pod \"console-66c67549db-42lpr\" (UID: \"0ee21b4c-e55e-4722-921c-f8e15ca79c5b\") " pod="openshift-console/console-66c67549db-42lpr"
Apr 24 16:48:24.899964 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:24.899858 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66c67549db-42lpr"
Apr 24 16:48:25.038616 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:25.038581 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66c67549db-42lpr"]
Apr 24 16:48:25.042215 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:48:25.042175 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ee21b4c_e55e_4722_921c_f8e15ca79c5b.slice/crio-9ac84fc06271334057980662bcb2deabdb2d8e010336e0f2781e125fe0fa6019 WatchSource:0}: Error finding container 9ac84fc06271334057980662bcb2deabdb2d8e010336e0f2781e125fe0fa6019: Status 404 returned error can't find the container with id 9ac84fc06271334057980662bcb2deabdb2d8e010336e0f2781e125fe0fa6019
Apr 24 16:48:25.753969 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:25.753934 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66c67549db-42lpr" event={"ID":"0ee21b4c-e55e-4722-921c-f8e15ca79c5b","Type":"ContainerStarted","Data":"d74b34bf3f947e5b542d071fd145842b2e6d6ee927f5a1c9a4724b513bb6de4f"}
Apr 24 16:48:25.753969 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:25.753971 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66c67549db-42lpr" event={"ID":"0ee21b4c-e55e-4722-921c-f8e15ca79c5b","Type":"ContainerStarted","Data":"9ac84fc06271334057980662bcb2deabdb2d8e010336e0f2781e125fe0fa6019"}
Apr 24 16:48:25.777537 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:25.777489 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66c67549db-42lpr" podStartSLOduration=1.7774724819999999 podStartE2EDuration="1.777472482s" podCreationTimestamp="2026-04-24 16:48:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:48:25.77607766 +0000 UTC m=+564.040793058" watchObservedRunningTime="2026-04-24 16:48:25.777472482 +0000 UTC m=+564.042187879"
Apr 24 16:48:34.900380 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:34.900293 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-66c67549db-42lpr"
Apr 24 16:48:34.900380 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:34.900391 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-66c67549db-42lpr"
Apr 24 16:48:34.905290 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:34.905264 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-66c67549db-42lpr"
Apr 24 16:48:35.795162 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:48:35.795133 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-66c67549db-42lpr"
Apr 24 16:49:01.154779 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.154737 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-ljksp"]
Apr 24 16:49:01.158294 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.158255 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-8697t"]
Apr 24 16:49:01.158464 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.158444 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-ljksp"
Apr 24 16:49:01.162865 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.162836 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-lfdnj\""
Apr 24 16:49:01.163270 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.162887 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 24 16:49:01.163877 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.163849 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-8697t"
Apr 24 16:49:01.170727 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.170702 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 24 16:49:01.170888 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.170867 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-9t2kx\""
Apr 24 16:49:01.179556 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.179530 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-ljksp"]
Apr 24 16:49:01.192217 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.192185 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-8697t"]
Apr 24 16:49:01.315875 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.315832 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1a74b032-5164-4e07-a38f-776ab1d0eaf7-tls-certs\") pod \"model-serving-api-86f7b4b499-ljksp\" (UID: \"1a74b032-5164-4e07-a38f-776ab1d0eaf7\") " pod="kserve/model-serving-api-86f7b4b499-ljksp"
Apr 24 16:49:01.315875 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.315877 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5mhk\" (UniqueName: \"kubernetes.io/projected/1a74b032-5164-4e07-a38f-776ab1d0eaf7-kube-api-access-g5mhk\") pod \"model-serving-api-86f7b4b499-ljksp\" (UID: \"1a74b032-5164-4e07-a38f-776ab1d0eaf7\") " pod="kserve/model-serving-api-86f7b4b499-ljksp"
Apr 24 16:49:01.316102 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.315906 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r44m\" (UniqueName: \"kubernetes.io/projected/a88daa8b-b0c0-4e87-8d60-7ffed570349b-kube-api-access-5r44m\") pod \"odh-model-controller-696fc77849-8697t\" (UID: \"a88daa8b-b0c0-4e87-8d60-7ffed570349b\") " pod="kserve/odh-model-controller-696fc77849-8697t"
Apr 24 16:49:01.316102 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.315968 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a88daa8b-b0c0-4e87-8d60-7ffed570349b-cert\") pod \"odh-model-controller-696fc77849-8697t\" (UID: \"a88daa8b-b0c0-4e87-8d60-7ffed570349b\") " pod="kserve/odh-model-controller-696fc77849-8697t"
Apr 24 16:49:01.416849 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.416756 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1a74b032-5164-4e07-a38f-776ab1d0eaf7-tls-certs\") pod \"model-serving-api-86f7b4b499-ljksp\" (UID: \"1a74b032-5164-4e07-a38f-776ab1d0eaf7\") " pod="kserve/model-serving-api-86f7b4b499-ljksp"
Apr 24 16:49:01.416849 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.416795 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5mhk\" (UniqueName: \"kubernetes.io/projected/1a74b032-5164-4e07-a38f-776ab1d0eaf7-kube-api-access-g5mhk\") pod \"model-serving-api-86f7b4b499-ljksp\" (UID: \"1a74b032-5164-4e07-a38f-776ab1d0eaf7\") " pod="kserve/model-serving-api-86f7b4b499-ljksp"
Apr 24 16:49:01.416849 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.416818 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5r44m\" (UniqueName: \"kubernetes.io/projected/a88daa8b-b0c0-4e87-8d60-7ffed570349b-kube-api-access-5r44m\") pod \"odh-model-controller-696fc77849-8697t\" (UID: \"a88daa8b-b0c0-4e87-8d60-7ffed570349b\") " pod="kserve/odh-model-controller-696fc77849-8697t"
Apr 24 16:49:01.417072 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.416993 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a88daa8b-b0c0-4e87-8d60-7ffed570349b-cert\") pod \"odh-model-controller-696fc77849-8697t\" (UID: \"a88daa8b-b0c0-4e87-8d60-7ffed570349b\") " pod="kserve/odh-model-controller-696fc77849-8697t"
Apr 24 16:49:01.419475 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.419451 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a88daa8b-b0c0-4e87-8d60-7ffed570349b-cert\") pod \"odh-model-controller-696fc77849-8697t\" (UID: \"a88daa8b-b0c0-4e87-8d60-7ffed570349b\") " pod="kserve/odh-model-controller-696fc77849-8697t"
Apr 24 16:49:01.419598 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.419504 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1a74b032-5164-4e07-a38f-776ab1d0eaf7-tls-certs\") pod \"model-serving-api-86f7b4b499-ljksp\" (UID: \"1a74b032-5164-4e07-a38f-776ab1d0eaf7\") " pod="kserve/model-serving-api-86f7b4b499-ljksp"
Apr 24 16:49:01.427693 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.427650 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5mhk\" (UniqueName: \"kubernetes.io/projected/1a74b032-5164-4e07-a38f-776ab1d0eaf7-kube-api-access-g5mhk\") pod \"model-serving-api-86f7b4b499-ljksp\" (UID: \"1a74b032-5164-4e07-a38f-776ab1d0eaf7\") " pod="kserve/model-serving-api-86f7b4b499-ljksp"
Apr 24 16:49:01.427893 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.427869 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r44m\" (UniqueName: \"kubernetes.io/projected/a88daa8b-b0c0-4e87-8d60-7ffed570349b-kube-api-access-5r44m\") pod \"odh-model-controller-696fc77849-8697t\" (UID: \"a88daa8b-b0c0-4e87-8d60-7ffed570349b\") " pod="kserve/odh-model-controller-696fc77849-8697t"
Apr 24 16:49:01.472832 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.472793 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-ljksp"
Apr 24 16:49:01.479793 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.479755 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-8697t"
Apr 24 16:49:01.624433 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.624397 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-ljksp"]
Apr 24 16:49:01.628833 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:49:01.628804 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a74b032_5164_4e07_a38f_776ab1d0eaf7.slice/crio-c8cac6d717a2d0dc4051c571898bf59ac4c7fd01abbb9717667e956e1e3a7988 WatchSource:0}: Error finding container c8cac6d717a2d0dc4051c571898bf59ac4c7fd01abbb9717667e956e1e3a7988: Status 404 returned error can't find the container with id c8cac6d717a2d0dc4051c571898bf59ac4c7fd01abbb9717667e956e1e3a7988
Apr 24 16:49:01.632935 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.632913 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-8697t"]
Apr 24 16:49:01.636297 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:49:01.636261 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda88daa8b_b0c0_4e87_8d60_7ffed570349b.slice/crio-708e6e86e5a866a94c1fc6b066ab1d2f27afe3689a4e26b8e649eb9bf5341794 WatchSource:0}: Error finding container 708e6e86e5a866a94c1fc6b066ab1d2f27afe3689a4e26b8e649eb9bf5341794: Status 404 returned error can't find the container with id 708e6e86e5a866a94c1fc6b066ab1d2f27afe3689a4e26b8e649eb9bf5341794
Apr 24 16:49:01.861908 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.861868 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-8697t" event={"ID":"a88daa8b-b0c0-4e87-8d60-7ffed570349b","Type":"ContainerStarted","Data":"708e6e86e5a866a94c1fc6b066ab1d2f27afe3689a4e26b8e649eb9bf5341794"}
Apr 24 16:49:01.862830 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:01.862805 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-ljksp" event={"ID":"1a74b032-5164-4e07-a38f-776ab1d0eaf7","Type":"ContainerStarted","Data":"c8cac6d717a2d0dc4051c571898bf59ac4c7fd01abbb9717667e956e1e3a7988"}
Apr 24 16:49:02.129532 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:02.129452 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log"
Apr 24 16:49:02.133697 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:02.133663 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log"
Apr 24 16:49:05.882214 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:05.882162 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-8697t" event={"ID":"a88daa8b-b0c0-4e87-8d60-7ffed570349b","Type":"ContainerStarted","Data":"42bb82d0dcef9781aa26d1a4fd100b52962906b4624988b400a4ac5272042a85"}
Apr 24 16:49:05.882689 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:05.882281 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-8697t"
Apr 24 16:49:05.883507 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:05.883485 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-ljksp" event={"ID":"1a74b032-5164-4e07-a38f-776ab1d0eaf7","Type":"ContainerStarted","Data":"0dd99feff4c7f7dc227fb8f80c21e35031e8cb0da894f42062c6d77659a45e32"}
Apr 24 16:49:05.883665 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:05.883648 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-ljksp"
Apr 24 16:49:05.900174 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:05.900115 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-8697t" podStartSLOduration=1.184561505 podStartE2EDuration="4.900100043s" podCreationTimestamp="2026-04-24 16:49:01 +0000 UTC" firstStartedPulling="2026-04-24 16:49:01.637551845 +0000 UTC m=+599.902267223" lastFinishedPulling="2026-04-24 16:49:05.353090384 +0000 UTC m=+603.617805761" observedRunningTime="2026-04-24 16:49:05.899250679 +0000 UTC m=+604.163966078" watchObservedRunningTime="2026-04-24 16:49:05.900100043 +0000 UTC m=+604.164815442"
Apr 24 16:49:05.916346 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:05.916271 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-ljksp" podStartSLOduration=1.198167297 podStartE2EDuration="4.916255205s" podCreationTimestamp="2026-04-24 16:49:01 +0000 UTC" firstStartedPulling="2026-04-24 16:49:01.630569999 +0000 UTC m=+599.895285375" lastFinishedPulling="2026-04-24 16:49:05.348657907 +0000 UTC m=+603.613373283" observedRunningTime="2026-04-24 16:49:05.914839103 +0000 UTC m=+604.179554502" watchObservedRunningTime="2026-04-24 16:49:05.916255205 +0000 UTC m=+604.180970602"
Apr 24 16:49:16.889208 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:16.889173 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-8697t"
Apr 24 16:49:16.890975 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:16.890956 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-ljksp"
Apr 24 16:49:28.657345 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:28.657255 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-dgmn4"]
Apr 24 16:49:28.709621 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:28.709578 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-dgmn4"]
Apr 24 16:49:28.709778 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:28.709704 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-dgmn4"
Apr 24 16:49:28.712356 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:28.712332 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 24 16:49:28.738579 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:28.738552 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh8l5\" (UniqueName: \"kubernetes.io/projected/98ff5ee2-3bd0-484f-b676-15a50ce3df74-kube-api-access-bh8l5\") pod \"seaweedfs-tls-custom-ddd4dbfd-dgmn4\" (UID: \"98ff5ee2-3bd0-484f-b676-15a50ce3df74\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-dgmn4"
Apr 24 16:49:28.738748 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:28.738587 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/98ff5ee2-3bd0-484f-b676-15a50ce3df74-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-dgmn4\" (UID: \"98ff5ee2-3bd0-484f-b676-15a50ce3df74\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-dgmn4"
Apr 24 16:49:28.839948 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:28.839908 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bh8l5\" (UniqueName: \"kubernetes.io/projected/98ff5ee2-3bd0-484f-b676-15a50ce3df74-kube-api-access-bh8l5\") pod \"seaweedfs-tls-custom-ddd4dbfd-dgmn4\" (UID: \"98ff5ee2-3bd0-484f-b676-15a50ce3df74\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-dgmn4"
Apr 24 16:49:28.839948 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:28.839950 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/98ff5ee2-3bd0-484f-b676-15a50ce3df74-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-dgmn4\" (UID: \"98ff5ee2-3bd0-484f-b676-15a50ce3df74\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-dgmn4"
Apr 24 16:49:28.840353 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:28.840337 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/98ff5ee2-3bd0-484f-b676-15a50ce3df74-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-dgmn4\" (UID: \"98ff5ee2-3bd0-484f-b676-15a50ce3df74\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-dgmn4"
Apr 24 16:49:28.848537 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:28.848512 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh8l5\" (UniqueName: \"kubernetes.io/projected/98ff5ee2-3bd0-484f-b676-15a50ce3df74-kube-api-access-bh8l5\") pod \"seaweedfs-tls-custom-ddd4dbfd-dgmn4\" (UID: \"98ff5ee2-3bd0-484f-b676-15a50ce3df74\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-dgmn4"
Apr 24 16:49:29.019097 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:29.019061 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-dgmn4"
Apr 24 16:49:29.139061 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:29.139029 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-dgmn4"]
Apr 24 16:49:29.142096 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:49:29.142057 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98ff5ee2_3bd0_484f_b676_15a50ce3df74.slice/crio-8c69382e557c1c85b0ab4f4322e47e1e81eca8d7670154203b36055c9a1923a1 WatchSource:0}: Error finding container 8c69382e557c1c85b0ab4f4322e47e1e81eca8d7670154203b36055c9a1923a1: Status 404 returned error can't find the container with id 8c69382e557c1c85b0ab4f4322e47e1e81eca8d7670154203b36055c9a1923a1
Apr 24 16:49:29.956732 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:29.956694 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-dgmn4" event={"ID":"98ff5ee2-3bd0-484f-b676-15a50ce3df74","Type":"ContainerStarted","Data":"8c2842989b8e2979c1fe12624df727549c7f0bfc5cb88830d0ca3ada36fac189"}
Apr 24 16:49:29.956732 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:29.956734 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-dgmn4" event={"ID":"98ff5ee2-3bd0-484f-b676-15a50ce3df74","Type":"ContainerStarted","Data":"8c69382e557c1c85b0ab4f4322e47e1e81eca8d7670154203b36055c9a1923a1"}
Apr 24 16:49:29.971550 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:29.971500 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-dgmn4" podStartSLOduration=1.738852832 podStartE2EDuration="1.971486155s" podCreationTimestamp="2026-04-24 16:49:28 +0000 UTC" firstStartedPulling="2026-04-24 16:49:29.143405025 +0000 UTC m=+627.408120408" lastFinishedPulling="2026-04-24 16:49:29.376038355 +0000 UTC m=+627.640753731" observedRunningTime="2026-04-24 16:49:29.970348066 +0000 UTC m=+628.235063465" watchObservedRunningTime="2026-04-24 16:49:29.971486155 +0000 UTC m=+628.236201553"
Apr 24 16:49:31.092562 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:31.092524 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-dgmn4"]
Apr 24 16:49:31.962326 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:49:31.962248 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-dgmn4" podUID="98ff5ee2-3bd0-484f-b676-15a50ce3df74" containerName="seaweedfs-tls-custom" containerID="cri-o://8c2842989b8e2979c1fe12624df727549c7f0bfc5cb88830d0ca3ada36fac189" gracePeriod=30
Apr 24 16:50:00.046059 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:00.046015 2573 generic.go:358] "Generic (PLEG): container finished" podID="98ff5ee2-3bd0-484f-b676-15a50ce3df74" containerID="8c2842989b8e2979c1fe12624df727549c7f0bfc5cb88830d0ca3ada36fac189" exitCode=0
Apr 24 16:50:00.046587 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:00.046080 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-dgmn4" event={"ID":"98ff5ee2-3bd0-484f-b676-15a50ce3df74","Type":"ContainerDied","Data":"8c2842989b8e2979c1fe12624df727549c7f0bfc5cb88830d0ca3ada36fac189"}
Apr 24 16:50:00.107149 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:00.107126 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-dgmn4"
Apr 24 16:50:00.192452 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:00.192354 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh8l5\" (UniqueName: \"kubernetes.io/projected/98ff5ee2-3bd0-484f-b676-15a50ce3df74-kube-api-access-bh8l5\") pod \"98ff5ee2-3bd0-484f-b676-15a50ce3df74\" (UID: \"98ff5ee2-3bd0-484f-b676-15a50ce3df74\") "
Apr 24 16:50:00.192452 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:00.192409 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/98ff5ee2-3bd0-484f-b676-15a50ce3df74-data\") pod \"98ff5ee2-3bd0-484f-b676-15a50ce3df74\" (UID: \"98ff5ee2-3bd0-484f-b676-15a50ce3df74\") "
Apr 24 16:50:00.193721 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:00.193684 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98ff5ee2-3bd0-484f-b676-15a50ce3df74-data" (OuterVolumeSpecName: "data") pod "98ff5ee2-3bd0-484f-b676-15a50ce3df74" (UID: "98ff5ee2-3bd0-484f-b676-15a50ce3df74"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 16:50:00.194781 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:00.194757 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98ff5ee2-3bd0-484f-b676-15a50ce3df74-kube-api-access-bh8l5" (OuterVolumeSpecName: "kube-api-access-bh8l5") pod "98ff5ee2-3bd0-484f-b676-15a50ce3df74" (UID: "98ff5ee2-3bd0-484f-b676-15a50ce3df74"). InnerVolumeSpecName "kube-api-access-bh8l5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 16:50:00.293497 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:00.293461 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bh8l5\" (UniqueName: \"kubernetes.io/projected/98ff5ee2-3bd0-484f-b676-15a50ce3df74-kube-api-access-bh8l5\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 16:50:00.293497 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:00.293493 2573 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/98ff5ee2-3bd0-484f-b676-15a50ce3df74-data\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 16:50:01.050154 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.050112 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-dgmn4" event={"ID":"98ff5ee2-3bd0-484f-b676-15a50ce3df74","Type":"ContainerDied","Data":"8c69382e557c1c85b0ab4f4322e47e1e81eca8d7670154203b36055c9a1923a1"}
Apr 24 16:50:01.050154 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.050147 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-dgmn4"
Apr 24 16:50:01.050154 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.050160 2573 scope.go:117] "RemoveContainer" containerID="8c2842989b8e2979c1fe12624df727549c7f0bfc5cb88830d0ca3ada36fac189"
Apr 24 16:50:01.066518 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.066491 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-dgmn4"]
Apr 24 16:50:01.070240 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.070215 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-dgmn4"]
Apr 24 16:50:01.103354 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.103317 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-94zdb"]
Apr 24 16:50:01.103620 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.103608 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98ff5ee2-3bd0-484f-b676-15a50ce3df74" containerName="seaweedfs-tls-custom"
Apr 24 16:50:01.103662 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.103623 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="98ff5ee2-3bd0-484f-b676-15a50ce3df74" containerName="seaweedfs-tls-custom"
Apr 24 16:50:01.103695 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.103683 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="98ff5ee2-3bd0-484f-b676-15a50ce3df74" containerName="seaweedfs-tls-custom"
Apr 24 16:50:01.107903 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.107886 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94zdb"
Apr 24 16:50:01.110161 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.110135 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\""
Apr 24 16:50:01.110274 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.110169 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 24 16:50:01.112883 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.112863 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-94zdb"]
Apr 24 16:50:01.200918 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.200878 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9a4d53d1-2ea2-4940-8b98-2e7c9641b22a-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-94zdb\" (UID: \"9a4d53d1-2ea2-4940-8b98-2e7c9641b22a\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94zdb"
Apr 24 16:50:01.200918 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.200922 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz88b\" (UniqueName: \"kubernetes.io/projected/9a4d53d1-2ea2-4940-8b98-2e7c9641b22a-kube-api-access-zz88b\") pod \"seaweedfs-tls-custom-5c88b85bb7-94zdb\" (UID: \"9a4d53d1-2ea2-4940-8b98-2e7c9641b22a\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94zdb"
Apr 24 16:50:01.201131 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.200946 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/9a4d53d1-2ea2-4940-8b98-2e7c9641b22a-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-94zdb\" (UID: \"9a4d53d1-2ea2-4940-8b98-2e7c9641b22a\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94zdb"
Apr 24 16:50:01.301738 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.301637 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9a4d53d1-2ea2-4940-8b98-2e7c9641b22a-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-94zdb\" (UID: \"9a4d53d1-2ea2-4940-8b98-2e7c9641b22a\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94zdb"
Apr 24 16:50:01.301738 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.301688 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zz88b\" (UniqueName: \"kubernetes.io/projected/9a4d53d1-2ea2-4940-8b98-2e7c9641b22a-kube-api-access-zz88b\") pod \"seaweedfs-tls-custom-5c88b85bb7-94zdb\" (UID: \"9a4d53d1-2ea2-4940-8b98-2e7c9641b22a\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94zdb"
Apr 24 16:50:01.301738 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.301719 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/9a4d53d1-2ea2-4940-8b98-2e7c9641b22a-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-94zdb\" (UID: \"9a4d53d1-2ea2-4940-8b98-2e7c9641b22a\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94zdb"
Apr 24 16:50:01.302079 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.302057 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9a4d53d1-2ea2-4940-8b98-2e7c9641b22a-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-94zdb\" (UID: \"9a4d53d1-2ea2-4940-8b98-2e7c9641b22a\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94zdb"
Apr 24 16:50:01.304342 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.304325 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/9a4d53d1-2ea2-4940-8b98-2e7c9641b22a-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-94zdb\" (UID: \"9a4d53d1-2ea2-4940-8b98-2e7c9641b22a\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94zdb"
Apr 24 16:50:01.316299 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.316270 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz88b\" (UniqueName: \"kubernetes.io/projected/9a4d53d1-2ea2-4940-8b98-2e7c9641b22a-kube-api-access-zz88b\") pod \"seaweedfs-tls-custom-5c88b85bb7-94zdb\" (UID: \"9a4d53d1-2ea2-4940-8b98-2e7c9641b22a\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94zdb"
Apr 24 16:50:01.417869 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.417808 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94zdb"
Apr 24 16:50:01.543170 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.543035 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-94zdb"]
Apr 24 16:50:01.545841 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:50:01.545807 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a4d53d1_2ea2_4940_8b98_2e7c9641b22a.slice/crio-15226c0e7978c6f9122dc7091efa8c12db7f6ddcd1db84a5c5cc38bef887421e WatchSource:0}: Error finding container 15226c0e7978c6f9122dc7091efa8c12db7f6ddcd1db84a5c5cc38bef887421e: Status 404 returned error can't find the container with id 15226c0e7978c6f9122dc7091efa8c12db7f6ddcd1db84a5c5cc38bef887421e
Apr 24 16:50:01.547025 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:01.547008 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 16:50:02.054586 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:02.054490 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94zdb"
event={"ID":"9a4d53d1-2ea2-4940-8b98-2e7c9641b22a","Type":"ContainerStarted","Data":"09e554b18413d88d32b39f08c2bfb53ba224fdbea72ec38faa48e3c8cf59b571"} Apr 24 16:50:02.054586 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:02.054540 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94zdb" event={"ID":"9a4d53d1-2ea2-4940-8b98-2e7c9641b22a","Type":"ContainerStarted","Data":"15226c0e7978c6f9122dc7091efa8c12db7f6ddcd1db84a5c5cc38bef887421e"} Apr 24 16:50:02.075074 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:02.074998 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-94zdb" podStartSLOduration=0.826313375 podStartE2EDuration="1.074980226s" podCreationTimestamp="2026-04-24 16:50:01 +0000 UTC" firstStartedPulling="2026-04-24 16:50:01.547131686 +0000 UTC m=+659.811847062" lastFinishedPulling="2026-04-24 16:50:01.795798534 +0000 UTC m=+660.060513913" observedRunningTime="2026-04-24 16:50:02.072389473 +0000 UTC m=+660.337104868" watchObservedRunningTime="2026-04-24 16:50:02.074980226 +0000 UTC m=+660.339695624" Apr 24 16:50:02.215950 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:02.215915 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98ff5ee2-3bd0-484f-b676-15a50ce3df74" path="/var/lib/kubelet/pods/98ff5ee2-3bd0-484f-b676-15a50ce3df74/volumes" Apr 24 16:50:30.099450 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:30.099413 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj"] Apr 24 16:50:30.102948 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:30.102928 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:50:30.106041 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:30.106012 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-predictor-serving-cert\"" Apr 24 16:50:30.106041 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:30.106038 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 16:50:30.107651 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:30.107632 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-2jnrr\"" Apr 24 16:50:30.107718 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:30.107672 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\"" Apr 24 16:50:30.107891 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:30.107870 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 16:50:30.122949 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:30.122913 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj"] Apr 24 16:50:30.139769 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:30.139735 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9131647b-b423-4c5a-9e80-c66ad6b211fe-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-5d9978c695-sfwmj\" (UID: \"9131647b-b423-4c5a-9e80-c66ad6b211fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:50:30.139955 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:30.139794 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9131647b-b423-4c5a-9e80-c66ad6b211fe-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-5d9978c695-sfwmj\" (UID: \"9131647b-b423-4c5a-9e80-c66ad6b211fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:50:30.139955 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:30.139844 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlgwb\" (UniqueName: \"kubernetes.io/projected/9131647b-b423-4c5a-9e80-c66ad6b211fe-kube-api-access-nlgwb\") pod \"isvc-sklearn-batcher-predictor-5d9978c695-sfwmj\" (UID: \"9131647b-b423-4c5a-9e80-c66ad6b211fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:50:30.139955 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:30.139886 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9131647b-b423-4c5a-9e80-c66ad6b211fe-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-5d9978c695-sfwmj\" (UID: \"9131647b-b423-4c5a-9e80-c66ad6b211fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:50:30.240682 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:30.240640 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9131647b-b423-4c5a-9e80-c66ad6b211fe-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-5d9978c695-sfwmj\" (UID: \"9131647b-b423-4c5a-9e80-c66ad6b211fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:50:30.240904 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:30.240729 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9131647b-b423-4c5a-9e80-c66ad6b211fe-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-5d9978c695-sfwmj\" (UID: \"9131647b-b423-4c5a-9e80-c66ad6b211fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:50:30.240904 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:50:30.240755 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-serving-cert: secret "isvc-sklearn-batcher-predictor-serving-cert" not found Apr 24 16:50:30.240904 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:30.240773 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nlgwb\" (UniqueName: \"kubernetes.io/projected/9131647b-b423-4c5a-9e80-c66ad6b211fe-kube-api-access-nlgwb\") pod \"isvc-sklearn-batcher-predictor-5d9978c695-sfwmj\" (UID: \"9131647b-b423-4c5a-9e80-c66ad6b211fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:50:30.240904 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:30.240804 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9131647b-b423-4c5a-9e80-c66ad6b211fe-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-5d9978c695-sfwmj\" (UID: \"9131647b-b423-4c5a-9e80-c66ad6b211fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:50:30.240904 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:50:30.240843 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9131647b-b423-4c5a-9e80-c66ad6b211fe-proxy-tls podName:9131647b-b423-4c5a-9e80-c66ad6b211fe nodeName:}" failed. No retries permitted until 2026-04-24 16:50:30.740820158 +0000 UTC m=+689.005535533 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9131647b-b423-4c5a-9e80-c66ad6b211fe-proxy-tls") pod "isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" (UID: "9131647b-b423-4c5a-9e80-c66ad6b211fe") : secret "isvc-sklearn-batcher-predictor-serving-cert" not found Apr 24 16:50:30.241404 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:30.241379 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9131647b-b423-4c5a-9e80-c66ad6b211fe-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-5d9978c695-sfwmj\" (UID: \"9131647b-b423-4c5a-9e80-c66ad6b211fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:50:30.241538 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:30.241519 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9131647b-b423-4c5a-9e80-c66ad6b211fe-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-5d9978c695-sfwmj\" (UID: \"9131647b-b423-4c5a-9e80-c66ad6b211fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:50:30.258904 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:30.258871 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlgwb\" (UniqueName: \"kubernetes.io/projected/9131647b-b423-4c5a-9e80-c66ad6b211fe-kube-api-access-nlgwb\") pod \"isvc-sklearn-batcher-predictor-5d9978c695-sfwmj\" (UID: \"9131647b-b423-4c5a-9e80-c66ad6b211fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:50:30.744556 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:30.744506 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/9131647b-b423-4c5a-9e80-c66ad6b211fe-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-5d9978c695-sfwmj\" (UID: \"9131647b-b423-4c5a-9e80-c66ad6b211fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:50:30.747353 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:30.747297 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9131647b-b423-4c5a-9e80-c66ad6b211fe-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-5d9978c695-sfwmj\" (UID: \"9131647b-b423-4c5a-9e80-c66ad6b211fe\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:50:31.014303 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:31.014199 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:50:31.144659 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:31.144627 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj"] Apr 24 16:50:31.148827 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:50:31.148794 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9131647b_b423_4c5a_9e80_c66ad6b211fe.slice/crio-fb1d34cb681b483627ac1afbea5f144616b9f651e86f3b2e551407b8dfabdb67 WatchSource:0}: Error finding container fb1d34cb681b483627ac1afbea5f144616b9f651e86f3b2e551407b8dfabdb67: Status 404 returned error can't find the container with id fb1d34cb681b483627ac1afbea5f144616b9f651e86f3b2e551407b8dfabdb67 Apr 24 16:50:32.145680 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:32.145635 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" 
event={"ID":"9131647b-b423-4c5a-9e80-c66ad6b211fe","Type":"ContainerStarted","Data":"fb1d34cb681b483627ac1afbea5f144616b9f651e86f3b2e551407b8dfabdb67"} Apr 24 16:50:35.154814 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:35.154772 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" event={"ID":"9131647b-b423-4c5a-9e80-c66ad6b211fe","Type":"ContainerStarted","Data":"147d5e79c19b18c957e8fa17863e5681560471d294e57d22e5608a0444192b01"} Apr 24 16:50:39.166655 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:39.166619 2573 generic.go:358] "Generic (PLEG): container finished" podID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerID="147d5e79c19b18c957e8fa17863e5681560471d294e57d22e5608a0444192b01" exitCode=0 Apr 24 16:50:39.166655 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:39.166663 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" event={"ID":"9131647b-b423-4c5a-9e80-c66ad6b211fe","Type":"ContainerDied","Data":"147d5e79c19b18c957e8fa17863e5681560471d294e57d22e5608a0444192b01"} Apr 24 16:50:53.212401 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:53.212366 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" event={"ID":"9131647b-b423-4c5a-9e80-c66ad6b211fe","Type":"ContainerStarted","Data":"246de53402db21720c8901d6eb139e9d644eaa0de08a0deaeb36af445ab75041"} Apr 24 16:50:55.221897 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:55.221857 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" event={"ID":"9131647b-b423-4c5a-9e80-c66ad6b211fe","Type":"ContainerStarted","Data":"f8e4c5e100b373f6e858edda03322f201b88cdf6532d4e401a9618a0eb35bb3e"} Apr 24 16:50:58.239663 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:58.239630 2573 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" event={"ID":"9131647b-b423-4c5a-9e80-c66ad6b211fe","Type":"ContainerStarted","Data":"abc1691678f32ee2f02bd7203d04ca8da3ebd4ff42905e3c02bda8b7c17ac328"} Apr 24 16:50:58.240127 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:58.239837 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:50:58.262873 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:58.262808 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podStartSLOduration=1.449045162 podStartE2EDuration="28.262787513s" podCreationTimestamp="2026-04-24 16:50:30 +0000 UTC" firstStartedPulling="2026-04-24 16:50:31.150629732 +0000 UTC m=+689.415345126" lastFinishedPulling="2026-04-24 16:50:57.964372097 +0000 UTC m=+716.229087477" observedRunningTime="2026-04-24 16:50:58.260059129 +0000 UTC m=+716.524774527" watchObservedRunningTime="2026-04-24 16:50:58.262787513 +0000 UTC m=+716.527502912" Apr 24 16:50:59.242921 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:59.242884 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:50:59.242921 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:59.242927 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:50:59.244116 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:59.244085 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 24 
16:50:59.244783 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:50:59.244759 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:51:00.246074 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:51:00.246031 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 24 16:51:00.246568 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:51:00.246454 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:51:00.249756 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:51:00.249734 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:51:01.248687 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:51:01.248642 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 24 16:51:01.249098 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:51:01.249007 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="agent" probeResult="failure" output="HTTP probe 
failed with statuscode: 503" Apr 24 16:51:11.249631 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:51:11.249580 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 24 16:51:11.250150 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:51:11.250090 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:51:21.249428 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:51:21.249383 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 24 16:51:21.249894 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:51:21.249837 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:51:31.249175 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:51:31.249121 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 24 16:51:31.249587 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:51:31.249548 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:51:41.248806 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:51:41.248759 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 24 16:51:41.249200 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:51:41.249171 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:51:51.248996 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:51:51.248942 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 24 16:51:51.249548 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:51:51.249357 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:52:01.250067 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:01.250038 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:52:01.250560 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:01.250494 2573 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:52:15.192252 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.192218 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj"] Apr 24 16:52:15.192703 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.192583 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kserve-container" containerID="cri-o://246de53402db21720c8901d6eb139e9d644eaa0de08a0deaeb36af445ab75041" gracePeriod=30 Apr 24 16:52:15.192703 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.192619 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="agent" containerID="cri-o://abc1691678f32ee2f02bd7203d04ca8da3ebd4ff42905e3c02bda8b7c17ac328" gracePeriod=30 Apr 24 16:52:15.192703 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.192659 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kube-rbac-proxy" containerID="cri-o://f8e4c5e100b373f6e858edda03322f201b88cdf6532d4e401a9618a0eb35bb3e" gracePeriod=30 Apr 24 16:52:15.247068 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.247020 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.21:8643/healthz\": dial tcp 10.134.0.21:8643: connect: connection refused" Apr 24 16:52:15.440598 
ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.440560 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg"] Apr 24 16:52:15.444068 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.444024 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" Apr 24 16:52:15.448274 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.448251 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\"" Apr 24 16:52:15.448392 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.448284 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-predictor-serving-cert\"" Apr 24 16:52:15.457372 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.457348 2573 generic.go:358] "Generic (PLEG): container finished" podID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerID="f8e4c5e100b373f6e858edda03322f201b88cdf6532d4e401a9618a0eb35bb3e" exitCode=2 Apr 24 16:52:15.457466 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.457417 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" event={"ID":"9131647b-b423-4c5a-9e80-c66ad6b211fe","Type":"ContainerDied","Data":"f8e4c5e100b373f6e858edda03322f201b88cdf6532d4e401a9618a0eb35bb3e"} Apr 24 16:52:15.466009 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.465981 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg"] Apr 24 16:52:15.621216 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.621178 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb8kl\" (UniqueName: 
\"kubernetes.io/projected/abaaf81a-49e4-41c5-8815-abd1e5158d77-kube-api-access-nb8kl\") pod \"isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg\" (UID: \"abaaf81a-49e4-41c5-8815-abd1e5158d77\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" Apr 24 16:52:15.621416 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.621228 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abaaf81a-49e4-41c5-8815-abd1e5158d77-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg\" (UID: \"abaaf81a-49e4-41c5-8815-abd1e5158d77\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" Apr 24 16:52:15.621416 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.621255 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/abaaf81a-49e4-41c5-8815-abd1e5158d77-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg\" (UID: \"abaaf81a-49e4-41c5-8815-abd1e5158d77\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" Apr 24 16:52:15.621416 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.621347 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/abaaf81a-49e4-41c5-8815-abd1e5158d77-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg\" (UID: \"abaaf81a-49e4-41c5-8815-abd1e5158d77\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" Apr 24 16:52:15.722601 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.722522 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/abaaf81a-49e4-41c5-8815-abd1e5158d77-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg\" (UID: \"abaaf81a-49e4-41c5-8815-abd1e5158d77\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" Apr 24 16:52:15.722601 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.722587 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nb8kl\" (UniqueName: \"kubernetes.io/projected/abaaf81a-49e4-41c5-8815-abd1e5158d77-kube-api-access-nb8kl\") pod \"isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg\" (UID: \"abaaf81a-49e4-41c5-8815-abd1e5158d77\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" Apr 24 16:52:15.722786 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.722619 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abaaf81a-49e4-41c5-8815-abd1e5158d77-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg\" (UID: \"abaaf81a-49e4-41c5-8815-abd1e5158d77\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" Apr 24 16:52:15.722786 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.722643 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/abaaf81a-49e4-41c5-8815-abd1e5158d77-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg\" (UID: \"abaaf81a-49e4-41c5-8815-abd1e5158d77\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" Apr 24 16:52:15.723094 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.723072 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/abaaf81a-49e4-41c5-8815-abd1e5158d77-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg\" (UID: \"abaaf81a-49e4-41c5-8815-abd1e5158d77\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" Apr 24 16:52:15.723391 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.723370 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/abaaf81a-49e4-41c5-8815-abd1e5158d77-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg\" (UID: \"abaaf81a-49e4-41c5-8815-abd1e5158d77\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" Apr 24 16:52:15.725140 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.725119 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/abaaf81a-49e4-41c5-8815-abd1e5158d77-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg\" (UID: \"abaaf81a-49e4-41c5-8815-abd1e5158d77\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" Apr 24 16:52:15.731817 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.731792 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb8kl\" (UniqueName: \"kubernetes.io/projected/abaaf81a-49e4-41c5-8815-abd1e5158d77-kube-api-access-nb8kl\") pod \"isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg\" (UID: \"abaaf81a-49e4-41c5-8815-abd1e5158d77\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" Apr 24 16:52:15.754540 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.754514 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" Apr 24 16:52:15.885432 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:15.885404 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg"] Apr 24 16:52:15.887350 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:52:15.887322 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabaaf81a_49e4_41c5_8815_abd1e5158d77.slice/crio-b2977eeb7c95aee56f385ef9d1cb62900ec5dbcbcac2d408bdea3fc7703879c7 WatchSource:0}: Error finding container b2977eeb7c95aee56f385ef9d1cb62900ec5dbcbcac2d408bdea3fc7703879c7: Status 404 returned error can't find the container with id b2977eeb7c95aee56f385ef9d1cb62900ec5dbcbcac2d408bdea3fc7703879c7 Apr 24 16:52:16.461473 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:16.461431 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" event={"ID":"abaaf81a-49e4-41c5-8815-abd1e5158d77","Type":"ContainerStarted","Data":"819ebfc1393f2f5553621ad1b363db7237a5ff155e3ad78b9a42c029f6ac9a18"} Apr 24 16:52:16.461473 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:16.461478 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" event={"ID":"abaaf81a-49e4-41c5-8815-abd1e5158d77","Type":"ContainerStarted","Data":"b2977eeb7c95aee56f385ef9d1cb62900ec5dbcbcac2d408bdea3fc7703879c7"} Apr 24 16:52:19.472994 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:19.472910 2573 generic.go:358] "Generic (PLEG): container finished" podID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerID="246de53402db21720c8901d6eb139e9d644eaa0de08a0deaeb36af445ab75041" exitCode=0 Apr 24 16:52:19.472994 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:19.472967 2573 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" event={"ID":"9131647b-b423-4c5a-9e80-c66ad6b211fe","Type":"ContainerDied","Data":"246de53402db21720c8901d6eb139e9d644eaa0de08a0deaeb36af445ab75041"} Apr 24 16:52:20.246332 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:20.246237 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.21:8643/healthz\": dial tcp 10.134.0.21:8643: connect: connection refused" Apr 24 16:52:20.477386 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:20.477353 2573 generic.go:358] "Generic (PLEG): container finished" podID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerID="819ebfc1393f2f5553621ad1b363db7237a5ff155e3ad78b9a42c029f6ac9a18" exitCode=0 Apr 24 16:52:20.477741 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:20.477423 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" event={"ID":"abaaf81a-49e4-41c5-8815-abd1e5158d77","Type":"ContainerDied","Data":"819ebfc1393f2f5553621ad1b363db7237a5ff155e3ad78b9a42c029f6ac9a18"} Apr 24 16:52:21.249288 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:21.249250 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:52:21.251495 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:21.251468 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.21:8080: connect: connection refused" Apr 24 16:52:21.483066 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:21.483034 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" event={"ID":"abaaf81a-49e4-41c5-8815-abd1e5158d77","Type":"ContainerStarted","Data":"5da02bd86cbe4335d58d029b7e587536348d410677b81c3510f1eec99e674050"} Apr 24 16:52:21.483066 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:21.483070 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" event={"ID":"abaaf81a-49e4-41c5-8815-abd1e5158d77","Type":"ContainerStarted","Data":"1de01ba9187ea8a01a99bc5db0d7b0774bed3bc7baeb0df3087ebb20cbe2a138"} Apr 24 16:52:21.483500 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:21.483081 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" event={"ID":"abaaf81a-49e4-41c5-8815-abd1e5158d77","Type":"ContainerStarted","Data":"f28e5cd7ae7af9f07e3373bcd8f8022f3ed870ee4c19930c154e5720aa70f1c8"} Apr 24 16:52:21.483500 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:21.483408 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" Apr 24 16:52:21.483590 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:21.483526 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" Apr 24 16:52:21.484696 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:21.484669 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:5000: connect: 
connection refused" Apr 24 16:52:21.506158 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:21.506065 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podStartSLOduration=6.506053153 podStartE2EDuration="6.506053153s" podCreationTimestamp="2026-04-24 16:52:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:52:21.50383436 +0000 UTC m=+799.768549757" watchObservedRunningTime="2026-04-24 16:52:21.506053153 +0000 UTC m=+799.770768550" Apr 24 16:52:22.486761 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:22.486726 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" Apr 24 16:52:22.487259 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:22.486772 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:5000: connect: connection refused" Apr 24 16:52:22.487815 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:22.487789 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:52:23.490653 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:23.490612 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:5000: connect: connection 
refused" Apr 24 16:52:23.491122 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:23.491098 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:52:25.246529 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:25.246487 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.21:8643/healthz\": dial tcp 10.134.0.21:8643: connect: connection refused" Apr 24 16:52:25.246935 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:25.246626 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:52:28.495692 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:28.495658 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" Apr 24 16:52:28.496363 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:28.496329 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:5000: connect: connection refused" Apr 24 16:52:28.496766 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:28.496745 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 
16:52:30.246743 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:30.246694 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.21:8643/healthz\": dial tcp 10.134.0.21:8643: connect: connection refused" Apr 24 16:52:31.249589 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:31.249549 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:52:31.250748 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:31.250718 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 24 16:52:35.246385 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:35.246335 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.21:8643/healthz\": dial tcp 10.134.0.21:8643: connect: connection refused" Apr 24 16:52:38.496834 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:38.496793 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:5000: connect: connection refused" Apr 24 16:52:38.497228 ip-10-0-142-182 kubenswrapper[2573]: 
I0424 16:52:38.497205 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:52:40.246457 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:40.246413 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.21:8643/healthz\": dial tcp 10.134.0.21:8643: connect: connection refused" Apr 24 16:52:41.249373 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:41.249300 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:52:41.249720 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:41.249482 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:52:41.250557 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:41.250529 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 24 16:52:41.250700 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:41.250662 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:52:45.247268 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.247219 2573 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.21:8643/healthz\": dial tcp 10.134.0.21:8643: connect: connection refused" Apr 24 16:52:45.344270 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.344244 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:52:45.478907 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.478811 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlgwb\" (UniqueName: \"kubernetes.io/projected/9131647b-b423-4c5a-9e80-c66ad6b211fe-kube-api-access-nlgwb\") pod \"9131647b-b423-4c5a-9e80-c66ad6b211fe\" (UID: \"9131647b-b423-4c5a-9e80-c66ad6b211fe\") " Apr 24 16:52:45.478907 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.478877 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9131647b-b423-4c5a-9e80-c66ad6b211fe-kserve-provision-location\") pod \"9131647b-b423-4c5a-9e80-c66ad6b211fe\" (UID: \"9131647b-b423-4c5a-9e80-c66ad6b211fe\") " Apr 24 16:52:45.479101 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.479000 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9131647b-b423-4c5a-9e80-c66ad6b211fe-proxy-tls\") pod \"9131647b-b423-4c5a-9e80-c66ad6b211fe\" (UID: \"9131647b-b423-4c5a-9e80-c66ad6b211fe\") " Apr 24 16:52:45.479101 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.479032 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/9131647b-b423-4c5a-9e80-c66ad6b211fe-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"9131647b-b423-4c5a-9e80-c66ad6b211fe\" (UID: \"9131647b-b423-4c5a-9e80-c66ad6b211fe\") " Apr 24 16:52:45.479395 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.479275 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9131647b-b423-4c5a-9e80-c66ad6b211fe-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9131647b-b423-4c5a-9e80-c66ad6b211fe" (UID: "9131647b-b423-4c5a-9e80-c66ad6b211fe"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:52:45.479526 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.479472 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9131647b-b423-4c5a-9e80-c66ad6b211fe-isvc-sklearn-batcher-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-kube-rbac-proxy-sar-config") pod "9131647b-b423-4c5a-9e80-c66ad6b211fe" (UID: "9131647b-b423-4c5a-9e80-c66ad6b211fe"). InnerVolumeSpecName "isvc-sklearn-batcher-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:52:45.481241 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.481209 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9131647b-b423-4c5a-9e80-c66ad6b211fe-kube-api-access-nlgwb" (OuterVolumeSpecName: "kube-api-access-nlgwb") pod "9131647b-b423-4c5a-9e80-c66ad6b211fe" (UID: "9131647b-b423-4c5a-9e80-c66ad6b211fe"). InnerVolumeSpecName "kube-api-access-nlgwb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:52:45.481354 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.481285 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9131647b-b423-4c5a-9e80-c66ad6b211fe-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9131647b-b423-4c5a-9e80-c66ad6b211fe" (UID: "9131647b-b423-4c5a-9e80-c66ad6b211fe"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:52:45.553805 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.553771 2573 generic.go:358] "Generic (PLEG): container finished" podID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerID="abc1691678f32ee2f02bd7203d04ca8da3ebd4ff42905e3c02bda8b7c17ac328" exitCode=0 Apr 24 16:52:45.553970 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.553868 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" Apr 24 16:52:45.553970 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.553862 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" event={"ID":"9131647b-b423-4c5a-9e80-c66ad6b211fe","Type":"ContainerDied","Data":"abc1691678f32ee2f02bd7203d04ca8da3ebd4ff42905e3c02bda8b7c17ac328"} Apr 24 16:52:45.554050 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.553976 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj" event={"ID":"9131647b-b423-4c5a-9e80-c66ad6b211fe","Type":"ContainerDied","Data":"fb1d34cb681b483627ac1afbea5f144616b9f651e86f3b2e551407b8dfabdb67"} Apr 24 16:52:45.554050 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.553993 2573 scope.go:117] "RemoveContainer" containerID="abc1691678f32ee2f02bd7203d04ca8da3ebd4ff42905e3c02bda8b7c17ac328" Apr 24 16:52:45.562381 ip-10-0-142-182 kubenswrapper[2573]: I0424 
16:52:45.562345 2573 scope.go:117] "RemoveContainer" containerID="f8e4c5e100b373f6e858edda03322f201b88cdf6532d4e401a9618a0eb35bb3e" Apr 24 16:52:45.569756 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.569740 2573 scope.go:117] "RemoveContainer" containerID="246de53402db21720c8901d6eb139e9d644eaa0de08a0deaeb36af445ab75041" Apr 24 16:52:45.576965 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.576944 2573 scope.go:117] "RemoveContainer" containerID="147d5e79c19b18c957e8fa17863e5681560471d294e57d22e5608a0444192b01" Apr 24 16:52:45.577217 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.577186 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj"] Apr 24 16:52:45.580193 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.580174 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9131647b-b423-4c5a-9e80-c66ad6b211fe-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:52:45.580272 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.580195 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9131647b-b423-4c5a-9e80-c66ad6b211fe-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:52:45.580272 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.580207 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nlgwb\" (UniqueName: \"kubernetes.io/projected/9131647b-b423-4c5a-9e80-c66ad6b211fe-kube-api-access-nlgwb\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:52:45.580272 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.580218 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/9131647b-b423-4c5a-9e80-c66ad6b211fe-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:52:45.584256 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.584233 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5d9978c695-sfwmj"] Apr 24 16:52:45.584836 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.584807 2573 scope.go:117] "RemoveContainer" containerID="abc1691678f32ee2f02bd7203d04ca8da3ebd4ff42905e3c02bda8b7c17ac328" Apr 24 16:52:45.585098 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:52:45.585081 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abc1691678f32ee2f02bd7203d04ca8da3ebd4ff42905e3c02bda8b7c17ac328\": container with ID starting with abc1691678f32ee2f02bd7203d04ca8da3ebd4ff42905e3c02bda8b7c17ac328 not found: ID does not exist" containerID="abc1691678f32ee2f02bd7203d04ca8da3ebd4ff42905e3c02bda8b7c17ac328" Apr 24 16:52:45.585158 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.585108 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abc1691678f32ee2f02bd7203d04ca8da3ebd4ff42905e3c02bda8b7c17ac328"} err="failed to get container status \"abc1691678f32ee2f02bd7203d04ca8da3ebd4ff42905e3c02bda8b7c17ac328\": rpc error: code = NotFound desc = could not find container \"abc1691678f32ee2f02bd7203d04ca8da3ebd4ff42905e3c02bda8b7c17ac328\": container with ID starting with abc1691678f32ee2f02bd7203d04ca8da3ebd4ff42905e3c02bda8b7c17ac328 not found: ID does not exist" Apr 24 16:52:45.585158 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.585126 2573 scope.go:117] "RemoveContainer" containerID="f8e4c5e100b373f6e858edda03322f201b88cdf6532d4e401a9618a0eb35bb3e" Apr 24 16:52:45.585379 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:52:45.585359 2573 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"f8e4c5e100b373f6e858edda03322f201b88cdf6532d4e401a9618a0eb35bb3e\": container with ID starting with f8e4c5e100b373f6e858edda03322f201b88cdf6532d4e401a9618a0eb35bb3e not found: ID does not exist" containerID="f8e4c5e100b373f6e858edda03322f201b88cdf6532d4e401a9618a0eb35bb3e" Apr 24 16:52:45.585452 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.585387 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8e4c5e100b373f6e858edda03322f201b88cdf6532d4e401a9618a0eb35bb3e"} err="failed to get container status \"f8e4c5e100b373f6e858edda03322f201b88cdf6532d4e401a9618a0eb35bb3e\": rpc error: code = NotFound desc = could not find container \"f8e4c5e100b373f6e858edda03322f201b88cdf6532d4e401a9618a0eb35bb3e\": container with ID starting with f8e4c5e100b373f6e858edda03322f201b88cdf6532d4e401a9618a0eb35bb3e not found: ID does not exist" Apr 24 16:52:45.585452 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.585410 2573 scope.go:117] "RemoveContainer" containerID="246de53402db21720c8901d6eb139e9d644eaa0de08a0deaeb36af445ab75041" Apr 24 16:52:45.585642 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:52:45.585625 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"246de53402db21720c8901d6eb139e9d644eaa0de08a0deaeb36af445ab75041\": container with ID starting with 246de53402db21720c8901d6eb139e9d644eaa0de08a0deaeb36af445ab75041 not found: ID does not exist" containerID="246de53402db21720c8901d6eb139e9d644eaa0de08a0deaeb36af445ab75041" Apr 24 16:52:45.585681 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.585646 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"246de53402db21720c8901d6eb139e9d644eaa0de08a0deaeb36af445ab75041"} err="failed to get container status \"246de53402db21720c8901d6eb139e9d644eaa0de08a0deaeb36af445ab75041\": rpc 
error: code = NotFound desc = could not find container \"246de53402db21720c8901d6eb139e9d644eaa0de08a0deaeb36af445ab75041\": container with ID starting with 246de53402db21720c8901d6eb139e9d644eaa0de08a0deaeb36af445ab75041 not found: ID does not exist" Apr 24 16:52:45.585681 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.585659 2573 scope.go:117] "RemoveContainer" containerID="147d5e79c19b18c957e8fa17863e5681560471d294e57d22e5608a0444192b01" Apr 24 16:52:45.585844 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:52:45.585827 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"147d5e79c19b18c957e8fa17863e5681560471d294e57d22e5608a0444192b01\": container with ID starting with 147d5e79c19b18c957e8fa17863e5681560471d294e57d22e5608a0444192b01 not found: ID does not exist" containerID="147d5e79c19b18c957e8fa17863e5681560471d294e57d22e5608a0444192b01" Apr 24 16:52:45.585881 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:45.585848 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"147d5e79c19b18c957e8fa17863e5681560471d294e57d22e5608a0444192b01"} err="failed to get container status \"147d5e79c19b18c957e8fa17863e5681560471d294e57d22e5608a0444192b01\": rpc error: code = NotFound desc = could not find container \"147d5e79c19b18c957e8fa17863e5681560471d294e57d22e5608a0444192b01\": container with ID starting with 147d5e79c19b18c957e8fa17863e5681560471d294e57d22e5608a0444192b01 not found: ID does not exist" Apr 24 16:52:46.216952 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:46.216915 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" path="/var/lib/kubelet/pods/9131647b-b423-4c5a-9e80-c66ad6b211fe/volumes" Apr 24 16:52:48.496477 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:48.496438 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:5000: connect: connection refused"
Apr 24 16:52:48.496945 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:48.496917 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:52:58.496958 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:58.496913 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:5000: connect: connection refused"
Apr 24 16:52:58.497463 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:52:58.497345 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:53:08.497122 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:08.497079 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:5000: connect: connection refused"
Apr 24 16:53:08.497561 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:08.497544 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:53:18.496987 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:18.496938 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:5000: connect: connection refused"
Apr 24 16:53:18.497392 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:18.497362 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:53:28.497552 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:28.497510 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg"
Apr 24 16:53:28.497951 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:28.497626 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg"
Apr 24 16:53:40.365411 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.365373 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg"]
Apr 24 16:53:40.365976 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.365866 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kserve-container" containerID="cri-o://f28e5cd7ae7af9f07e3373bcd8f8022f3ed870ee4c19930c154e5720aa70f1c8" gracePeriod=30
Apr 24 16:53:40.365976 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.365943 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kube-rbac-proxy" containerID="cri-o://1de01ba9187ea8a01a99bc5db0d7b0774bed3bc7baeb0df3087ebb20cbe2a138" gracePeriod=30
Apr 24 16:53:40.365976 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.365943 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="agent" containerID="cri-o://5da02bd86cbe4335d58d029b7e587536348d410677b81c3510f1eec99e674050" gracePeriod=30
Apr 24 16:53:40.427515 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.427483 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k"]
Apr 24 16:53:40.427790 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.427777 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="storage-initializer"
Apr 24 16:53:40.427849 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.427792 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="storage-initializer"
Apr 24 16:53:40.427849 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.427801 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="agent"
Apr 24 16:53:40.427849 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.427807 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="agent"
Apr 24 16:53:40.427849 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.427825 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kserve-container"
Apr 24 16:53:40.427849 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.427835 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kserve-container"
Apr 24 16:53:40.427849 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.427852 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kube-rbac-proxy"
Apr 24 16:53:40.428028 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.427858 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kube-rbac-proxy"
Apr 24 16:53:40.428028 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.427909 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kserve-container"
Apr 24 16:53:40.428028 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.427918 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="kube-rbac-proxy"
Apr 24 16:53:40.428028 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.427926 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9131647b-b423-4c5a-9e80-c66ad6b211fe" containerName="agent"
Apr 24 16:53:40.430996 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.430979 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k"
Apr 24 16:53:40.434183 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.434160 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-predictor-serving-cert\""
Apr 24 16:53:40.434711 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.434688 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-kube-rbac-proxy-sar-config\""
Apr 24 16:53:40.446209 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.446170 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k"]
Apr 24 16:53:40.523898 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.523859 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-g257k\" (UID: \"2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k"
Apr 24 16:53:40.524069 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.523906 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-g257k\" (UID: \"2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k"
Apr 24 16:53:40.524069 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.523937 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph9qj\" (UniqueName: \"kubernetes.io/projected/2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8-kube-api-access-ph9qj\") pod \"message-dumper-predictor-c7d86bcbd-g257k\" (UID: \"2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k"
Apr 24 16:53:40.624522 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.624440 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-g257k\" (UID: \"2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k"
Apr 24 16:53:40.624522 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.624484 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-g257k\" (UID: \"2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k"
Apr 24 16:53:40.624754 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:53:40.624576 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/message-dumper-predictor-serving-cert: secret "message-dumper-predictor-serving-cert" not found
Apr 24 16:53:40.624754 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.624605 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ph9qj\" (UniqueName: \"kubernetes.io/projected/2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8-kube-api-access-ph9qj\") pod \"message-dumper-predictor-c7d86bcbd-g257k\" (UID: \"2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k"
Apr 24 16:53:40.624754 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:53:40.624625 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8-proxy-tls podName:2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8 nodeName:}" failed. No retries permitted until 2026-04-24 16:53:41.124609582 +0000 UTC m=+879.389324958 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8-proxy-tls") pod "message-dumper-predictor-c7d86bcbd-g257k" (UID: "2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8") : secret "message-dumper-predictor-serving-cert" not found
Apr 24 16:53:40.625067 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.625046 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-g257k\" (UID: \"2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k"
Apr 24 16:53:40.636870 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.636844 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph9qj\" (UniqueName: \"kubernetes.io/projected/2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8-kube-api-access-ph9qj\") pod \"message-dumper-predictor-c7d86bcbd-g257k\" (UID: \"2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k"
Apr 24 16:53:40.715847 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.715813 2573 generic.go:358] "Generic (PLEG): container finished" podID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerID="1de01ba9187ea8a01a99bc5db0d7b0774bed3bc7baeb0df3087ebb20cbe2a138" exitCode=2
Apr 24 16:53:40.716003 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:40.715872 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" event={"ID":"abaaf81a-49e4-41c5-8815-abd1e5158d77","Type":"ContainerDied","Data":"1de01ba9187ea8a01a99bc5db0d7b0774bed3bc7baeb0df3087ebb20cbe2a138"}
Apr 24 16:53:41.130135 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:41.130094 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-g257k\" (UID: \"2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k"
Apr 24 16:53:41.132741 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:41.132709 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-g257k\" (UID: \"2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k"
Apr 24 16:53:41.341635 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:41.341595 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k"
Apr 24 16:53:41.476752 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:41.476572 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k"]
Apr 24 16:53:41.479569 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:53:41.479529 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a1bfbec_83f2_4e90_8428_8e6aea3cf2f8.slice/crio-9798686173aadb55cfd955396db65e2815658d13509376c0975ac177adc589a3 WatchSource:0}: Error finding container 9798686173aadb55cfd955396db65e2815658d13509376c0975ac177adc589a3: Status 404 returned error can't find the container with id 9798686173aadb55cfd955396db65e2815658d13509376c0975ac177adc589a3
Apr 24 16:53:41.722231 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:41.722120 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k" event={"ID":"2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8","Type":"ContainerStarted","Data":"9798686173aadb55cfd955396db65e2815658d13509376c0975ac177adc589a3"}
Apr 24 16:53:42.732240 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:42.732194 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k" event={"ID":"2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8","Type":"ContainerStarted","Data":"ce54143b4a5d483e48036c5f61adf0722fe649832628b53c95fca8da9e45d99c"}
Apr 24 16:53:43.490840 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:43.490790 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused"
Apr 24 16:53:43.736877 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:43.736843 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k" event={"ID":"2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8","Type":"ContainerStarted","Data":"d39fee5596ae4e176585d25e7837d3c845ad8b59d3878535b302cb93785fc659"}
Apr 24 16:53:43.737250 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:43.736980 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k"
Apr 24 16:53:43.756628 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:43.756531 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k" podStartSLOduration=2.6279863150000002 podStartE2EDuration="3.756516964s" podCreationTimestamp="2026-04-24 16:53:40 +0000 UTC" firstStartedPulling="2026-04-24 16:53:41.481505403 +0000 UTC m=+879.746220778" lastFinishedPulling="2026-04-24 16:53:42.610036042 +0000 UTC m=+880.874751427" observedRunningTime="2026-04-24 16:53:43.755456568 +0000 UTC m=+882.020171990" watchObservedRunningTime="2026-04-24 16:53:43.756516964 +0000 UTC m=+882.021232363"
Apr 24 16:53:44.740143 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:44.740112 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k"
Apr 24 16:53:44.741867 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:44.741843 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k"
Apr 24 16:53:45.745070 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:45.745034 2573 generic.go:358] "Generic (PLEG): container finished" podID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerID="f28e5cd7ae7af9f07e3373bcd8f8022f3ed870ee4c19930c154e5720aa70f1c8" exitCode=0
Apr 24 16:53:45.745448 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:45.745104 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" event={"ID":"abaaf81a-49e4-41c5-8815-abd1e5158d77","Type":"ContainerDied","Data":"f28e5cd7ae7af9f07e3373bcd8f8022f3ed870ee4c19930c154e5720aa70f1c8"}
Apr 24 16:53:48.491498 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:48.491450 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused"
Apr 24 16:53:48.497044 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:48.497016 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:5000: connect: connection refused"
Apr 24 16:53:48.497345 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:48.497300 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:53:51.752543 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:51.752466 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k"
Apr 24 16:53:53.490932 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:53.490885 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused"
Apr 24 16:53:53.491289 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:53.491024 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg"
Apr 24 16:53:58.490931 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:58.490883 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused"
Apr 24 16:53:58.496244 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:58.496216 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:5000: connect: connection refused"
Apr 24 16:53:58.496639 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:53:58.496615 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:54:00.486275 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:00.486235 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp"]
Apr 24 16:54:00.489276 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:00.489252 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp"
Apr 24 16:54:00.491601 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:00.491575 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-kube-rbac-proxy-sar-config\""
Apr 24 16:54:00.491721 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:00.491575 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-predictor-serving-cert\""
Apr 24 16:54:00.501649 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:00.501626 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp"]
Apr 24 16:54:00.600480 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:00.600439 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-proxy-tls\") pod \"isvc-logger-predictor-6b866b99d-x72wp\" (UID: \"c7ac697c-8c16-4f87-8a8b-4ea450e0577e\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp"
Apr 24 16:54:00.600480 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:00.600480 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5bmt\" (UniqueName: \"kubernetes.io/projected/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-kube-api-access-h5bmt\") pod \"isvc-logger-predictor-6b866b99d-x72wp\" (UID: \"c7ac697c-8c16-4f87-8a8b-4ea450e0577e\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp"
Apr 24 16:54:00.600730 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:00.600575 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-6b866b99d-x72wp\" (UID: \"c7ac697c-8c16-4f87-8a8b-4ea450e0577e\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp"
Apr 24 16:54:00.600775 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:00.600705 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-kserve-provision-location\") pod \"isvc-logger-predictor-6b866b99d-x72wp\" (UID: \"c7ac697c-8c16-4f87-8a8b-4ea450e0577e\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp"
Apr 24 16:54:00.701505 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:00.701471 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-6b866b99d-x72wp\" (UID: \"c7ac697c-8c16-4f87-8a8b-4ea450e0577e\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp"
Apr 24 16:54:00.701705 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:00.701588 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-kserve-provision-location\") pod \"isvc-logger-predictor-6b866b99d-x72wp\" (UID: \"c7ac697c-8c16-4f87-8a8b-4ea450e0577e\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp"
Apr 24 16:54:00.701705 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:00.701626 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-proxy-tls\") pod \"isvc-logger-predictor-6b866b99d-x72wp\" (UID: \"c7ac697c-8c16-4f87-8a8b-4ea450e0577e\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp"
Apr 24 16:54:00.701705 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:00.701656 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5bmt\" (UniqueName: \"kubernetes.io/projected/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-kube-api-access-h5bmt\") pod \"isvc-logger-predictor-6b866b99d-x72wp\" (UID: \"c7ac697c-8c16-4f87-8a8b-4ea450e0577e\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp"
Apr 24 16:54:00.702182 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:00.702153 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-kserve-provision-location\") pod \"isvc-logger-predictor-6b866b99d-x72wp\" (UID: \"c7ac697c-8c16-4f87-8a8b-4ea450e0577e\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp"
Apr 24 16:54:00.702411 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:00.702386 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-6b866b99d-x72wp\" (UID: \"c7ac697c-8c16-4f87-8a8b-4ea450e0577e\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp"
Apr 24 16:54:00.704347 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:00.704328 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-proxy-tls\") pod \"isvc-logger-predictor-6b866b99d-x72wp\" (UID: \"c7ac697c-8c16-4f87-8a8b-4ea450e0577e\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp"
Apr 24 16:54:00.711030 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:00.711003 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5bmt\" (UniqueName: \"kubernetes.io/projected/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-kube-api-access-h5bmt\") pod \"isvc-logger-predictor-6b866b99d-x72wp\" (UID: \"c7ac697c-8c16-4f87-8a8b-4ea450e0577e\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp"
Apr 24 16:54:00.800233 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:00.800190 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp"
Apr 24 16:54:00.928413 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:00.928354 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp"]
Apr 24 16:54:01.793181 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:01.793143 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" event={"ID":"c7ac697c-8c16-4f87-8a8b-4ea450e0577e","Type":"ContainerStarted","Data":"c2229b6d05c458cf1a6005ddd3c2d6e6968226c8c48d9bd7203cd4649ae20fb7"}
Apr 24 16:54:01.793181 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:01.793182 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" event={"ID":"c7ac697c-8c16-4f87-8a8b-4ea450e0577e","Type":"ContainerStarted","Data":"c1cd54a1e4fd876639dd58add3a305a10399696b721a2fc1d03c90332d37bb03"}
Apr 24 16:54:02.152962 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:02.152888 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log"
Apr 24 16:54:02.154040 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:02.154017 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log"
Apr 24 16:54:03.491696 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:03.491655 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused"
Apr 24 16:54:04.802765 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:04.802727 2573 generic.go:358] "Generic (PLEG): container finished" podID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerID="c2229b6d05c458cf1a6005ddd3c2d6e6968226c8c48d9bd7203cd4649ae20fb7" exitCode=0
Apr 24 16:54:04.803232 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:04.802812 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" event={"ID":"c7ac697c-8c16-4f87-8a8b-4ea450e0577e","Type":"ContainerDied","Data":"c2229b6d05c458cf1a6005ddd3c2d6e6968226c8c48d9bd7203cd4649ae20fb7"}
Apr 24 16:54:05.808203 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:05.808167 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" event={"ID":"c7ac697c-8c16-4f87-8a8b-4ea450e0577e","Type":"ContainerStarted","Data":"bb2b3eba041a08f7a166dceb500c24eff97f06b24b9f87e6d4b2522bc277ad72"}
Apr 24 16:54:05.808609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:05.808215 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" event={"ID":"c7ac697c-8c16-4f87-8a8b-4ea450e0577e","Type":"ContainerStarted","Data":"6da1805b9598e5519c5fa03f115891cda18120696b5bef0aee9a19a84dc4f367"}
Apr 24 16:54:05.808609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:05.808227 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" event={"ID":"c7ac697c-8c16-4f87-8a8b-4ea450e0577e","Type":"ContainerStarted","Data":"6764efdc15398629b854ed53e0a6315e4f608ff8428ff4e05da659eb242c3889"}
Apr 24 16:54:05.808609 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:05.808561 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp"
Apr 24 16:54:05.808729 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:05.808677 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp"
Apr 24 16:54:05.809836 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:05.809814 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 24 16:54:05.831328 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:05.831251 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podStartSLOduration=5.83123174 podStartE2EDuration="5.83123174s" podCreationTimestamp="2026-04-24 16:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:54:05.828440821 +0000 UTC m=+904.093156218" watchObservedRunningTime="2026-04-24 16:54:05.83123174 +0000 UTC m=+904.095947141"
Apr 24 16:54:06.811679 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:06.811639 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp"
Apr 24 16:54:06.812130 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:06.811675 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 24 16:54:06.812614 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:06.812589 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:54:07.815037 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:07.814989 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 24 16:54:07.815468 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:07.815433 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:54:08.490813 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:08.490765 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused"
Apr 24 16:54:08.497267 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:08.497222 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:5000: connect: connection refused"
Apr 24 16:54:08.497453 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:08.497376 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg"
Apr 24 16:54:08.497583 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:08.497558 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:54:08.497666 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:08.497655 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg"
Apr 24 16:54:10.516465 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.516436 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg"
Apr 24 16:54:10.694943 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.694843 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abaaf81a-49e4-41c5-8815-abd1e5158d77-kserve-provision-location\") pod \"abaaf81a-49e4-41c5-8815-abd1e5158d77\" (UID: \"abaaf81a-49e4-41c5-8815-abd1e5158d77\") "
Apr 24 16:54:10.694943 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.694894 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/abaaf81a-49e4-41c5-8815-abd1e5158d77-proxy-tls\") pod \"abaaf81a-49e4-41c5-8815-abd1e5158d77\" (UID: \"abaaf81a-49e4-41c5-8815-abd1e5158d77\") "
Apr 24 16:54:10.694943 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.694924 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/abaaf81a-49e4-41c5-8815-abd1e5158d77-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"abaaf81a-49e4-41c5-8815-abd1e5158d77\" (UID: \"abaaf81a-49e4-41c5-8815-abd1e5158d77\") "
Apr 24 16:54:10.695232 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.695015 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb8kl\" (UniqueName: \"kubernetes.io/projected/abaaf81a-49e4-41c5-8815-abd1e5158d77-kube-api-access-nb8kl\") pod \"abaaf81a-49e4-41c5-8815-abd1e5158d77\" (UID: \"abaaf81a-49e4-41c5-8815-abd1e5158d77\") "
Apr 24 16:54:10.695333 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.695276 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abaaf81a-49e4-41c5-8815-abd1e5158d77-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "abaaf81a-49e4-41c5-8815-abd1e5158d77" (UID: "abaaf81a-49e4-41c5-8815-abd1e5158d77"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 16:54:10.695406 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.695374 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abaaf81a-49e4-41c5-8815-abd1e5158d77-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config") pod "abaaf81a-49e4-41c5-8815-abd1e5158d77" (UID: "abaaf81a-49e4-41c5-8815-abd1e5158d77"). InnerVolumeSpecName "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:54:10.697263 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.697239 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abaaf81a-49e4-41c5-8815-abd1e5158d77-kube-api-access-nb8kl" (OuterVolumeSpecName: "kube-api-access-nb8kl") pod "abaaf81a-49e4-41c5-8815-abd1e5158d77" (UID: "abaaf81a-49e4-41c5-8815-abd1e5158d77"). InnerVolumeSpecName "kube-api-access-nb8kl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:54:10.697357 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.697279 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abaaf81a-49e4-41c5-8815-abd1e5158d77-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "abaaf81a-49e4-41c5-8815-abd1e5158d77" (UID: "abaaf81a-49e4-41c5-8815-abd1e5158d77"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:54:10.795883 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.795832 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nb8kl\" (UniqueName: \"kubernetes.io/projected/abaaf81a-49e4-41c5-8815-abd1e5158d77-kube-api-access-nb8kl\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:54:10.795883 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.795873 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abaaf81a-49e4-41c5-8815-abd1e5158d77-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:54:10.795883 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.795887 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/abaaf81a-49e4-41c5-8815-abd1e5158d77-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:54:10.796117 ip-10-0-142-182 kubenswrapper[2573]: 
I0424 16:54:10.795900 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/abaaf81a-49e4-41c5-8815-abd1e5158d77-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:54:10.825327 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.825278 2573 generic.go:358] "Generic (PLEG): container finished" podID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerID="5da02bd86cbe4335d58d029b7e587536348d410677b81c3510f1eec99e674050" exitCode=0 Apr 24 16:54:10.825486 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.825351 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" event={"ID":"abaaf81a-49e4-41c5-8815-abd1e5158d77","Type":"ContainerDied","Data":"5da02bd86cbe4335d58d029b7e587536348d410677b81c3510f1eec99e674050"} Apr 24 16:54:10.825486 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.825381 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" event={"ID":"abaaf81a-49e4-41c5-8815-abd1e5158d77","Type":"ContainerDied","Data":"b2977eeb7c95aee56f385ef9d1cb62900ec5dbcbcac2d408bdea3fc7703879c7"} Apr 24 16:54:10.825486 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.825400 2573 scope.go:117] "RemoveContainer" containerID="5da02bd86cbe4335d58d029b7e587536348d410677b81c3510f1eec99e674050" Apr 24 16:54:10.825486 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.825433 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg" Apr 24 16:54:10.834001 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.833982 2573 scope.go:117] "RemoveContainer" containerID="1de01ba9187ea8a01a99bc5db0d7b0774bed3bc7baeb0df3087ebb20cbe2a138" Apr 24 16:54:10.841533 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.841509 2573 scope.go:117] "RemoveContainer" containerID="f28e5cd7ae7af9f07e3373bcd8f8022f3ed870ee4c19930c154e5720aa70f1c8" Apr 24 16:54:10.848767 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.848638 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg"] Apr 24 16:54:10.848839 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.848767 2573 scope.go:117] "RemoveContainer" containerID="819ebfc1393f2f5553621ad1b363db7237a5ff155e3ad78b9a42c029f6ac9a18" Apr 24 16:54:10.855857 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.855835 2573 scope.go:117] "RemoveContainer" containerID="5da02bd86cbe4335d58d029b7e587536348d410677b81c3510f1eec99e674050" Apr 24 16:54:10.856092 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:54:10.856075 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5da02bd86cbe4335d58d029b7e587536348d410677b81c3510f1eec99e674050\": container with ID starting with 5da02bd86cbe4335d58d029b7e587536348d410677b81c3510f1eec99e674050 not found: ID does not exist" containerID="5da02bd86cbe4335d58d029b7e587536348d410677b81c3510f1eec99e674050" Apr 24 16:54:10.856136 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.856102 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5da02bd86cbe4335d58d029b7e587536348d410677b81c3510f1eec99e674050"} err="failed to get container status \"5da02bd86cbe4335d58d029b7e587536348d410677b81c3510f1eec99e674050\": rpc error: code = NotFound desc = could not 
find container \"5da02bd86cbe4335d58d029b7e587536348d410677b81c3510f1eec99e674050\": container with ID starting with 5da02bd86cbe4335d58d029b7e587536348d410677b81c3510f1eec99e674050 not found: ID does not exist" Apr 24 16:54:10.856136 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.856121 2573 scope.go:117] "RemoveContainer" containerID="1de01ba9187ea8a01a99bc5db0d7b0774bed3bc7baeb0df3087ebb20cbe2a138" Apr 24 16:54:10.856352 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:54:10.856337 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1de01ba9187ea8a01a99bc5db0d7b0774bed3bc7baeb0df3087ebb20cbe2a138\": container with ID starting with 1de01ba9187ea8a01a99bc5db0d7b0774bed3bc7baeb0df3087ebb20cbe2a138 not found: ID does not exist" containerID="1de01ba9187ea8a01a99bc5db0d7b0774bed3bc7baeb0df3087ebb20cbe2a138" Apr 24 16:54:10.856396 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.856360 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1de01ba9187ea8a01a99bc5db0d7b0774bed3bc7baeb0df3087ebb20cbe2a138"} err="failed to get container status \"1de01ba9187ea8a01a99bc5db0d7b0774bed3bc7baeb0df3087ebb20cbe2a138\": rpc error: code = NotFound desc = could not find container \"1de01ba9187ea8a01a99bc5db0d7b0774bed3bc7baeb0df3087ebb20cbe2a138\": container with ID starting with 1de01ba9187ea8a01a99bc5db0d7b0774bed3bc7baeb0df3087ebb20cbe2a138 not found: ID does not exist" Apr 24 16:54:10.856396 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.856373 2573 scope.go:117] "RemoveContainer" containerID="f28e5cd7ae7af9f07e3373bcd8f8022f3ed870ee4c19930c154e5720aa70f1c8" Apr 24 16:54:10.856567 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:54:10.856547 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f28e5cd7ae7af9f07e3373bcd8f8022f3ed870ee4c19930c154e5720aa70f1c8\": container with ID 
starting with f28e5cd7ae7af9f07e3373bcd8f8022f3ed870ee4c19930c154e5720aa70f1c8 not found: ID does not exist" containerID="f28e5cd7ae7af9f07e3373bcd8f8022f3ed870ee4c19930c154e5720aa70f1c8" Apr 24 16:54:10.856606 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.856573 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f28e5cd7ae7af9f07e3373bcd8f8022f3ed870ee4c19930c154e5720aa70f1c8"} err="failed to get container status \"f28e5cd7ae7af9f07e3373bcd8f8022f3ed870ee4c19930c154e5720aa70f1c8\": rpc error: code = NotFound desc = could not find container \"f28e5cd7ae7af9f07e3373bcd8f8022f3ed870ee4c19930c154e5720aa70f1c8\": container with ID starting with f28e5cd7ae7af9f07e3373bcd8f8022f3ed870ee4c19930c154e5720aa70f1c8 not found: ID does not exist" Apr 24 16:54:10.856606 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.856591 2573 scope.go:117] "RemoveContainer" containerID="819ebfc1393f2f5553621ad1b363db7237a5ff155e3ad78b9a42c029f6ac9a18" Apr 24 16:54:10.856840 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:54:10.856820 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"819ebfc1393f2f5553621ad1b363db7237a5ff155e3ad78b9a42c029f6ac9a18\": container with ID starting with 819ebfc1393f2f5553621ad1b363db7237a5ff155e3ad78b9a42c029f6ac9a18 not found: ID does not exist" containerID="819ebfc1393f2f5553621ad1b363db7237a5ff155e3ad78b9a42c029f6ac9a18" Apr 24 16:54:10.856907 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.856845 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"819ebfc1393f2f5553621ad1b363db7237a5ff155e3ad78b9a42c029f6ac9a18"} err="failed to get container status \"819ebfc1393f2f5553621ad1b363db7237a5ff155e3ad78b9a42c029f6ac9a18\": rpc error: code = NotFound desc = could not find container \"819ebfc1393f2f5553621ad1b363db7237a5ff155e3ad78b9a42c029f6ac9a18\": container with ID starting with 
819ebfc1393f2f5553621ad1b363db7237a5ff155e3ad78b9a42c029f6ac9a18 not found: ID does not exist" Apr 24 16:54:10.860294 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:10.860273 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7487fd8f4-jd5wg"] Apr 24 16:54:12.217560 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:12.217522 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" path="/var/lib/kubelet/pods/abaaf81a-49e4-41c5-8815-abd1e5158d77/volumes" Apr 24 16:54:12.819293 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:12.819265 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" Apr 24 16:54:12.819924 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:12.819896 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 16:54:12.820271 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:12.820245 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:54:22.820082 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:22.820032 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 16:54:22.820530 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:22.820490 2573 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:54:32.820420 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:32.820374 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 16:54:32.820893 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:32.820869 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:54:42.820835 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:42.820779 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 16:54:42.821350 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:42.821242 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:54:52.820537 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:52.820490 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection 
refused" Apr 24 16:54:52.820993 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:54:52.820970 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:55:02.819986 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:02.819938 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 16:55:02.820468 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:02.820440 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:55:12.820531 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:12.820491 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" Apr 24 16:55:12.821009 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:12.820940 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" Apr 24 16:55:25.499744 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.499665 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-c7d86bcbd-g257k_2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8/kserve-container/0.log" Apr 24 16:55:25.661816 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.661784 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k"] Apr 24 16:55:25.662138 
ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.662082 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k" podUID="2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8" containerName="kserve-container" containerID="cri-o://ce54143b4a5d483e48036c5f61adf0722fe649832628b53c95fca8da9e45d99c" gracePeriod=30 Apr 24 16:55:25.662138 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.662106 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k" podUID="2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8" containerName="kube-rbac-proxy" containerID="cri-o://d39fee5596ae4e176585d25e7837d3c845ad8b59d3878535b302cb93785fc659" gracePeriod=30 Apr 24 16:55:25.724382 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.724349 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp"] Apr 24 16:55:25.724729 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.724693 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kserve-container" containerID="cri-o://6764efdc15398629b854ed53e0a6315e4f608ff8428ff4e05da659eb242c3889" gracePeriod=30 Apr 24 16:55:25.724869 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.724727 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="agent" containerID="cri-o://bb2b3eba041a08f7a166dceb500c24eff97f06b24b9f87e6d4b2522bc277ad72" gracePeriod=30 Apr 24 16:55:25.724869 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.724757 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" 
podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kube-rbac-proxy" containerID="cri-o://6da1805b9598e5519c5fa03f115891cda18120696b5bef0aee9a19a84dc4f367" gracePeriod=30 Apr 24 16:55:25.752130 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.752056 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5"] Apr 24 16:55:25.752402 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.752389 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="storage-initializer" Apr 24 16:55:25.752452 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.752404 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="storage-initializer" Apr 24 16:55:25.752452 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.752412 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kube-rbac-proxy" Apr 24 16:55:25.752452 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.752418 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kube-rbac-proxy" Apr 24 16:55:25.752452 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.752436 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kserve-container" Apr 24 16:55:25.752452 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.752442 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kserve-container" Apr 24 16:55:25.752452 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.752453 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="agent" Apr 24 16:55:25.752623 
ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.752458 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="agent" Apr 24 16:55:25.752623 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.752525 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kserve-container" Apr 24 16:55:25.752623 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.752535 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="agent" Apr 24 16:55:25.752623 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.752544 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="abaaf81a-49e4-41c5-8815-abd1e5158d77" containerName="kube-rbac-proxy" Apr 24 16:55:25.755782 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.755761 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" Apr 24 16:55:25.758022 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.757995 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-kube-rbac-proxy-sar-config\"" Apr 24 16:55:25.758135 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.758089 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-predictor-serving-cert\"" Apr 24 16:55:25.766780 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.766751 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5"] Apr 24 16:55:25.806960 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.806924 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/99c42237-83d3-4bd1-9809-bc21f0ca6219-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-566h5\" (UID: \"99c42237-83d3-4bd1-9809-bc21f0ca6219\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" Apr 24 16:55:25.807136 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.806990 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99c42237-83d3-4bd1-9809-bc21f0ca6219-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-566h5\" (UID: \"99c42237-83d3-4bd1-9809-bc21f0ca6219\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" Apr 24 16:55:25.807136 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.807016 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gznbw\" (UniqueName: \"kubernetes.io/projected/99c42237-83d3-4bd1-9809-bc21f0ca6219-kube-api-access-gznbw\") pod \"isvc-lightgbm-predictor-bdf964bd-566h5\" (UID: \"99c42237-83d3-4bd1-9809-bc21f0ca6219\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" Apr 24 16:55:25.807136 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.807042 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99c42237-83d3-4bd1-9809-bc21f0ca6219-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-566h5\" (UID: \"99c42237-83d3-4bd1-9809-bc21f0ca6219\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" Apr 24 16:55:25.907622 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.907583 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99c42237-83d3-4bd1-9809-bc21f0ca6219-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-566h5\" (UID: 
\"99c42237-83d3-4bd1-9809-bc21f0ca6219\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" Apr 24 16:55:25.907795 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.907634 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gznbw\" (UniqueName: \"kubernetes.io/projected/99c42237-83d3-4bd1-9809-bc21f0ca6219-kube-api-access-gznbw\") pod \"isvc-lightgbm-predictor-bdf964bd-566h5\" (UID: \"99c42237-83d3-4bd1-9809-bc21f0ca6219\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" Apr 24 16:55:25.908139 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.908110 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99c42237-83d3-4bd1-9809-bc21f0ca6219-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-566h5\" (UID: \"99c42237-83d3-4bd1-9809-bc21f0ca6219\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" Apr 24 16:55:25.908277 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.908256 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99c42237-83d3-4bd1-9809-bc21f0ca6219-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-566h5\" (UID: \"99c42237-83d3-4bd1-9809-bc21f0ca6219\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" Apr 24 16:55:25.908534 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.908507 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99c42237-83d3-4bd1-9809-bc21f0ca6219-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-566h5\" (UID: \"99c42237-83d3-4bd1-9809-bc21f0ca6219\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" Apr 24 16:55:25.909073 ip-10-0-142-182 
kubenswrapper[2573]: I0424 16:55:25.909025 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99c42237-83d3-4bd1-9809-bc21f0ca6219-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-566h5\" (UID: \"99c42237-83d3-4bd1-9809-bc21f0ca6219\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" Apr 24 16:55:25.910455 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.910434 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99c42237-83d3-4bd1-9809-bc21f0ca6219-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-566h5\" (UID: \"99c42237-83d3-4bd1-9809-bc21f0ca6219\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" Apr 24 16:55:25.916796 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.916770 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gznbw\" (UniqueName: \"kubernetes.io/projected/99c42237-83d3-4bd1-9809-bc21f0ca6219-kube-api-access-gznbw\") pod \"isvc-lightgbm-predictor-bdf964bd-566h5\" (UID: \"99c42237-83d3-4bd1-9809-bc21f0ca6219\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" Apr 24 16:55:25.953512 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:25.953491 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k" Apr 24 16:55:26.009623 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.009537 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph9qj\" (UniqueName: \"kubernetes.io/projected/2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8-kube-api-access-ph9qj\") pod \"2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8\" (UID: \"2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8\") " Apr 24 16:55:26.009623 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.009580 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8-message-dumper-kube-rbac-proxy-sar-config\") pod \"2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8\" (UID: \"2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8\") " Apr 24 16:55:26.009623 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.009601 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8-proxy-tls\") pod \"2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8\" (UID: \"2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8\") " Apr 24 16:55:26.009984 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.009949 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8-message-dumper-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-kube-rbac-proxy-sar-config") pod "2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8" (UID: "2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8"). InnerVolumeSpecName "message-dumper-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:55:26.011862 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.011837 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8" (UID: "2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:55:26.011862 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.011847 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8-kube-api-access-ph9qj" (OuterVolumeSpecName: "kube-api-access-ph9qj") pod "2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8" (UID: "2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8"). InnerVolumeSpecName "kube-api-access-ph9qj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:55:26.043699 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.043668 2573 generic.go:358] "Generic (PLEG): container finished" podID="2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8" containerID="d39fee5596ae4e176585d25e7837d3c845ad8b59d3878535b302cb93785fc659" exitCode=2 Apr 24 16:55:26.043699 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.043692 2573 generic.go:358] "Generic (PLEG): container finished" podID="2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8" containerID="ce54143b4a5d483e48036c5f61adf0722fe649832628b53c95fca8da9e45d99c" exitCode=2 Apr 24 16:55:26.043926 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.043741 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k" Apr 24 16:55:26.043926 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.043753 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k" event={"ID":"2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8","Type":"ContainerDied","Data":"d39fee5596ae4e176585d25e7837d3c845ad8b59d3878535b302cb93785fc659"} Apr 24 16:55:26.043926 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.043787 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k" event={"ID":"2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8","Type":"ContainerDied","Data":"ce54143b4a5d483e48036c5f61adf0722fe649832628b53c95fca8da9e45d99c"} Apr 24 16:55:26.043926 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.043802 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k" event={"ID":"2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8","Type":"ContainerDied","Data":"9798686173aadb55cfd955396db65e2815658d13509376c0975ac177adc589a3"} Apr 24 16:55:26.043926 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.043823 2573 scope.go:117] "RemoveContainer" containerID="d39fee5596ae4e176585d25e7837d3c845ad8b59d3878535b302cb93785fc659" Apr 24 16:55:26.046128 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.046107 2573 generic.go:358] "Generic (PLEG): container finished" podID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerID="6da1805b9598e5519c5fa03f115891cda18120696b5bef0aee9a19a84dc4f367" exitCode=2 Apr 24 16:55:26.046250 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.046168 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" event={"ID":"c7ac697c-8c16-4f87-8a8b-4ea450e0577e","Type":"ContainerDied","Data":"6da1805b9598e5519c5fa03f115891cda18120696b5bef0aee9a19a84dc4f367"} Apr 24 16:55:26.052360 
ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.052333 2573 scope.go:117] "RemoveContainer" containerID="ce54143b4a5d483e48036c5f61adf0722fe649832628b53c95fca8da9e45d99c" Apr 24 16:55:26.059237 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.059220 2573 scope.go:117] "RemoveContainer" containerID="d39fee5596ae4e176585d25e7837d3c845ad8b59d3878535b302cb93785fc659" Apr 24 16:55:26.059521 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:55:26.059503 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d39fee5596ae4e176585d25e7837d3c845ad8b59d3878535b302cb93785fc659\": container with ID starting with d39fee5596ae4e176585d25e7837d3c845ad8b59d3878535b302cb93785fc659 not found: ID does not exist" containerID="d39fee5596ae4e176585d25e7837d3c845ad8b59d3878535b302cb93785fc659" Apr 24 16:55:26.059620 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.059531 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d39fee5596ae4e176585d25e7837d3c845ad8b59d3878535b302cb93785fc659"} err="failed to get container status \"d39fee5596ae4e176585d25e7837d3c845ad8b59d3878535b302cb93785fc659\": rpc error: code = NotFound desc = could not find container \"d39fee5596ae4e176585d25e7837d3c845ad8b59d3878535b302cb93785fc659\": container with ID starting with d39fee5596ae4e176585d25e7837d3c845ad8b59d3878535b302cb93785fc659 not found: ID does not exist" Apr 24 16:55:26.059620 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.059549 2573 scope.go:117] "RemoveContainer" containerID="ce54143b4a5d483e48036c5f61adf0722fe649832628b53c95fca8da9e45d99c" Apr 24 16:55:26.059818 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:55:26.059800 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce54143b4a5d483e48036c5f61adf0722fe649832628b53c95fca8da9e45d99c\": container with ID starting with 
ce54143b4a5d483e48036c5f61adf0722fe649832628b53c95fca8da9e45d99c not found: ID does not exist" containerID="ce54143b4a5d483e48036c5f61adf0722fe649832628b53c95fca8da9e45d99c" Apr 24 16:55:26.059860 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.059824 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce54143b4a5d483e48036c5f61adf0722fe649832628b53c95fca8da9e45d99c"} err="failed to get container status \"ce54143b4a5d483e48036c5f61adf0722fe649832628b53c95fca8da9e45d99c\": rpc error: code = NotFound desc = could not find container \"ce54143b4a5d483e48036c5f61adf0722fe649832628b53c95fca8da9e45d99c\": container with ID starting with ce54143b4a5d483e48036c5f61adf0722fe649832628b53c95fca8da9e45d99c not found: ID does not exist" Apr 24 16:55:26.059860 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.059840 2573 scope.go:117] "RemoveContainer" containerID="d39fee5596ae4e176585d25e7837d3c845ad8b59d3878535b302cb93785fc659" Apr 24 16:55:26.060070 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.060051 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d39fee5596ae4e176585d25e7837d3c845ad8b59d3878535b302cb93785fc659"} err="failed to get container status \"d39fee5596ae4e176585d25e7837d3c845ad8b59d3878535b302cb93785fc659\": rpc error: code = NotFound desc = could not find container \"d39fee5596ae4e176585d25e7837d3c845ad8b59d3878535b302cb93785fc659\": container with ID starting with d39fee5596ae4e176585d25e7837d3c845ad8b59d3878535b302cb93785fc659 not found: ID does not exist" Apr 24 16:55:26.060115 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.060071 2573 scope.go:117] "RemoveContainer" containerID="ce54143b4a5d483e48036c5f61adf0722fe649832628b53c95fca8da9e45d99c" Apr 24 16:55:26.060284 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.060266 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ce54143b4a5d483e48036c5f61adf0722fe649832628b53c95fca8da9e45d99c"} err="failed to get container status \"ce54143b4a5d483e48036c5f61adf0722fe649832628b53c95fca8da9e45d99c\": rpc error: code = NotFound desc = could not find container \"ce54143b4a5d483e48036c5f61adf0722fe649832628b53c95fca8da9e45d99c\": container with ID starting with ce54143b4a5d483e48036c5f61adf0722fe649832628b53c95fca8da9e45d99c not found: ID does not exist" Apr 24 16:55:26.066180 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.066164 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" Apr 24 16:55:26.085691 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.085651 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k"] Apr 24 16:55:26.094897 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.094864 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-g257k"] Apr 24 16:55:26.111039 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.111008 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ph9qj\" (UniqueName: \"kubernetes.io/projected/2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8-kube-api-access-ph9qj\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:55:26.111039 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.111041 2573 reconciler_common.go:299] "Volume detached for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8-message-dumper-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:55:26.111432 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.111056 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:55:26.192891 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.192864 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5"] Apr 24 16:55:26.195387 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:55:26.195354 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99c42237_83d3_4bd1_9809_bc21f0ca6219.slice/crio-37eaf0adcb6458d805860252aad99ebc89b102dc6c4925e7035a55fa6091a313 WatchSource:0}: Error finding container 37eaf0adcb6458d805860252aad99ebc89b102dc6c4925e7035a55fa6091a313: Status 404 returned error can't find the container with id 37eaf0adcb6458d805860252aad99ebc89b102dc6c4925e7035a55fa6091a313 Apr 24 16:55:26.197159 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.197141 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 16:55:26.217155 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:26.217128 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8" path="/var/lib/kubelet/pods/2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8/volumes" Apr 24 16:55:27.055521 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:27.055484 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" event={"ID":"99c42237-83d3-4bd1-9809-bc21f0ca6219","Type":"ContainerStarted","Data":"f3e204114eb1a64d92a14882e1d77a09053d0ddb7d484bd5dab61fa69bc83bd8"} Apr 24 16:55:27.055521 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:27.055521 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" 
event={"ID":"99c42237-83d3-4bd1-9809-bc21f0ca6219","Type":"ContainerStarted","Data":"37eaf0adcb6458d805860252aad99ebc89b102dc6c4925e7035a55fa6091a313"} Apr 24 16:55:27.815822 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:27.815772 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.24:8643/healthz\": dial tcp 10.134.0.24:8643: connect: connection refused" Apr 24 16:55:30.065918 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:30.065885 2573 generic.go:358] "Generic (PLEG): container finished" podID="99c42237-83d3-4bd1-9809-bc21f0ca6219" containerID="f3e204114eb1a64d92a14882e1d77a09053d0ddb7d484bd5dab61fa69bc83bd8" exitCode=0 Apr 24 16:55:30.065918 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:30.065930 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" event={"ID":"99c42237-83d3-4bd1-9809-bc21f0ca6219","Type":"ContainerDied","Data":"f3e204114eb1a64d92a14882e1d77a09053d0ddb7d484bd5dab61fa69bc83bd8"} Apr 24 16:55:31.072181 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:31.072046 2573 generic.go:358] "Generic (PLEG): container finished" podID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerID="6764efdc15398629b854ed53e0a6315e4f608ff8428ff4e05da659eb242c3889" exitCode=0 Apr 24 16:55:31.072181 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:31.072128 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" event={"ID":"c7ac697c-8c16-4f87-8a8b-4ea450e0577e","Type":"ContainerDied","Data":"6764efdc15398629b854ed53e0a6315e4f608ff8428ff4e05da659eb242c3889"} Apr 24 16:55:32.815912 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:32.815860 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.24:8643/healthz\": dial tcp 10.134.0.24:8643: connect: connection refused" Apr 24 16:55:32.820545 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:32.820504 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 16:55:32.822226 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:32.822168 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:55:37.093093 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:37.093056 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" event={"ID":"99c42237-83d3-4bd1-9809-bc21f0ca6219","Type":"ContainerStarted","Data":"8cd96b45f2814b23a270d2fd608c142641f94af419ea86c9a8f3add4c0926303"} Apr 24 16:55:37.093093 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:37.093099 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" event={"ID":"99c42237-83d3-4bd1-9809-bc21f0ca6219","Type":"ContainerStarted","Data":"b3bb87dd884196200927da7400e789928d3b399e4df78536cc68c615225384dc"} Apr 24 16:55:37.093529 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:37.093298 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" Apr 24 16:55:37.114869 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:37.114821 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" podStartSLOduration=5.845264877 podStartE2EDuration="12.114806511s" podCreationTimestamp="2026-04-24 16:55:25 +0000 UTC" firstStartedPulling="2026-04-24 16:55:30.067226856 +0000 UTC m=+988.331942232" lastFinishedPulling="2026-04-24 16:55:36.336768475 +0000 UTC m=+994.601483866" observedRunningTime="2026-04-24 16:55:37.113131385 +0000 UTC m=+995.377846782" watchObservedRunningTime="2026-04-24 16:55:37.114806511 +0000 UTC m=+995.379521910" Apr 24 16:55:37.816182 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:37.816126 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.24:8643/healthz\": dial tcp 10.134.0.24:8643: connect: connection refused" Apr 24 16:55:37.816405 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:37.816282 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" Apr 24 16:55:38.097207 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:38.097121 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" Apr 24 16:55:38.098385 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:38.098358 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" podUID="99c42237-83d3-4bd1-9809-bc21f0ca6219" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 24 16:55:39.099748 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:39.099707 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" 
podUID="99c42237-83d3-4bd1-9809-bc21f0ca6219" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 24 16:55:42.815120 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:42.815084 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.24:8643/healthz\": dial tcp 10.134.0.24:8643: connect: connection refused" Apr 24 16:55:42.820666 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:42.820639 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 16:55:42.822250 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:42.822219 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:55:44.104726 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:44.104692 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" Apr 24 16:55:44.105326 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:44.105280 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" podUID="99c42237-83d3-4bd1-9809-bc21f0ca6219" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 24 16:55:47.815759 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:47.815711 2573 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.24:8643/healthz\": dial tcp 10.134.0.24:8643: connect: connection refused" Apr 24 16:55:52.815509 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:52.815458 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.24:8643/healthz\": dial tcp 10.134.0.24:8643: connect: connection refused" Apr 24 16:55:52.820809 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:52.820783 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 24 16:55:52.820931 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:52.820917 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" Apr 24 16:55:52.821341 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:52.821293 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:55:52.821431 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:52.821420 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" Apr 24 16:55:54.105717 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:54.105678 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" podUID="99c42237-83d3-4bd1-9809-bc21f0ca6219" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 24 16:55:55.912467 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:55.912438 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" Apr 24 16:55:56.080920 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.080866 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-isvc-logger-kube-rbac-proxy-sar-config\") pod \"c7ac697c-8c16-4f87-8a8b-4ea450e0577e\" (UID: \"c7ac697c-8c16-4f87-8a8b-4ea450e0577e\") " Apr 24 16:55:56.081125 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.080981 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-proxy-tls\") pod \"c7ac697c-8c16-4f87-8a8b-4ea450e0577e\" (UID: \"c7ac697c-8c16-4f87-8a8b-4ea450e0577e\") " Apr 24 16:55:56.081125 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.081026 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5bmt\" (UniqueName: \"kubernetes.io/projected/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-kube-api-access-h5bmt\") pod \"c7ac697c-8c16-4f87-8a8b-4ea450e0577e\" (UID: \"c7ac697c-8c16-4f87-8a8b-4ea450e0577e\") " Apr 24 16:55:56.081125 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.081060 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-kserve-provision-location\") pod \"c7ac697c-8c16-4f87-8a8b-4ea450e0577e\" (UID: 
\"c7ac697c-8c16-4f87-8a8b-4ea450e0577e\") " Apr 24 16:55:56.081413 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.081383 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-isvc-logger-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-kube-rbac-proxy-sar-config") pod "c7ac697c-8c16-4f87-8a8b-4ea450e0577e" (UID: "c7ac697c-8c16-4f87-8a8b-4ea450e0577e"). InnerVolumeSpecName "isvc-logger-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:55:56.081413 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.081388 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c7ac697c-8c16-4f87-8a8b-4ea450e0577e" (UID: "c7ac697c-8c16-4f87-8a8b-4ea450e0577e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:55:56.083373 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.083348 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c7ac697c-8c16-4f87-8a8b-4ea450e0577e" (UID: "c7ac697c-8c16-4f87-8a8b-4ea450e0577e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:55:56.083447 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.083340 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-kube-api-access-h5bmt" (OuterVolumeSpecName: "kube-api-access-h5bmt") pod "c7ac697c-8c16-4f87-8a8b-4ea450e0577e" (UID: "c7ac697c-8c16-4f87-8a8b-4ea450e0577e"). InnerVolumeSpecName "kube-api-access-h5bmt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:55:56.150867 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.150835 2573 generic.go:358] "Generic (PLEG): container finished" podID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerID="bb2b3eba041a08f7a166dceb500c24eff97f06b24b9f87e6d4b2522bc277ad72" exitCode=0 Apr 24 16:55:56.151041 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.150922 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" event={"ID":"c7ac697c-8c16-4f87-8a8b-4ea450e0577e","Type":"ContainerDied","Data":"bb2b3eba041a08f7a166dceb500c24eff97f06b24b9f87e6d4b2522bc277ad72"} Apr 24 16:55:56.151041 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.150969 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" event={"ID":"c7ac697c-8c16-4f87-8a8b-4ea450e0577e","Type":"ContainerDied","Data":"c1cd54a1e4fd876639dd58add3a305a10399696b721a2fc1d03c90332d37bb03"} Apr 24 16:55:56.151041 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.150989 2573 scope.go:117] "RemoveContainer" containerID="bb2b3eba041a08f7a166dceb500c24eff97f06b24b9f87e6d4b2522bc277ad72" Apr 24 16:55:56.151041 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.150934 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp" Apr 24 16:55:56.159907 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.159886 2573 scope.go:117] "RemoveContainer" containerID="6da1805b9598e5519c5fa03f115891cda18120696b5bef0aee9a19a84dc4f367" Apr 24 16:55:56.167431 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.167410 2573 scope.go:117] "RemoveContainer" containerID="6764efdc15398629b854ed53e0a6315e4f608ff8428ff4e05da659eb242c3889" Apr 24 16:55:56.173633 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.173607 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp"] Apr 24 16:55:56.175295 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.175280 2573 scope.go:117] "RemoveContainer" containerID="c2229b6d05c458cf1a6005ddd3c2d6e6968226c8c48d9bd7203cd4649ae20fb7" Apr 24 16:55:56.177667 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.177644 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-6b866b99d-x72wp"] Apr 24 16:55:56.182036 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.182016 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-isvc-logger-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:55:56.182120 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.182064 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:55:56.182120 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.182076 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h5bmt\" (UniqueName: 
\"kubernetes.io/projected/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-kube-api-access-h5bmt\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:55:56.182120 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.182086 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7ac697c-8c16-4f87-8a8b-4ea450e0577e-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:55:56.183008 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.182995 2573 scope.go:117] "RemoveContainer" containerID="bb2b3eba041a08f7a166dceb500c24eff97f06b24b9f87e6d4b2522bc277ad72" Apr 24 16:55:56.183269 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:55:56.183252 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb2b3eba041a08f7a166dceb500c24eff97f06b24b9f87e6d4b2522bc277ad72\": container with ID starting with bb2b3eba041a08f7a166dceb500c24eff97f06b24b9f87e6d4b2522bc277ad72 not found: ID does not exist" containerID="bb2b3eba041a08f7a166dceb500c24eff97f06b24b9f87e6d4b2522bc277ad72" Apr 24 16:55:56.183329 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.183278 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb2b3eba041a08f7a166dceb500c24eff97f06b24b9f87e6d4b2522bc277ad72"} err="failed to get container status \"bb2b3eba041a08f7a166dceb500c24eff97f06b24b9f87e6d4b2522bc277ad72\": rpc error: code = NotFound desc = could not find container \"bb2b3eba041a08f7a166dceb500c24eff97f06b24b9f87e6d4b2522bc277ad72\": container with ID starting with bb2b3eba041a08f7a166dceb500c24eff97f06b24b9f87e6d4b2522bc277ad72 not found: ID does not exist" Apr 24 16:55:56.183329 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.183294 2573 scope.go:117] "RemoveContainer" containerID="6da1805b9598e5519c5fa03f115891cda18120696b5bef0aee9a19a84dc4f367" Apr 24 16:55:56.183542 
ip-10-0-142-182 kubenswrapper[2573]: E0424 16:55:56.183516 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6da1805b9598e5519c5fa03f115891cda18120696b5bef0aee9a19a84dc4f367\": container with ID starting with 6da1805b9598e5519c5fa03f115891cda18120696b5bef0aee9a19a84dc4f367 not found: ID does not exist" containerID="6da1805b9598e5519c5fa03f115891cda18120696b5bef0aee9a19a84dc4f367" Apr 24 16:55:56.183592 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.183550 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6da1805b9598e5519c5fa03f115891cda18120696b5bef0aee9a19a84dc4f367"} err="failed to get container status \"6da1805b9598e5519c5fa03f115891cda18120696b5bef0aee9a19a84dc4f367\": rpc error: code = NotFound desc = could not find container \"6da1805b9598e5519c5fa03f115891cda18120696b5bef0aee9a19a84dc4f367\": container with ID starting with 6da1805b9598e5519c5fa03f115891cda18120696b5bef0aee9a19a84dc4f367 not found: ID does not exist" Apr 24 16:55:56.183592 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.183571 2573 scope.go:117] "RemoveContainer" containerID="6764efdc15398629b854ed53e0a6315e4f608ff8428ff4e05da659eb242c3889" Apr 24 16:55:56.183757 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:55:56.183742 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6764efdc15398629b854ed53e0a6315e4f608ff8428ff4e05da659eb242c3889\": container with ID starting with 6764efdc15398629b854ed53e0a6315e4f608ff8428ff4e05da659eb242c3889 not found: ID does not exist" containerID="6764efdc15398629b854ed53e0a6315e4f608ff8428ff4e05da659eb242c3889" Apr 24 16:55:56.183799 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.183759 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6764efdc15398629b854ed53e0a6315e4f608ff8428ff4e05da659eb242c3889"} err="failed to get container status \"6764efdc15398629b854ed53e0a6315e4f608ff8428ff4e05da659eb242c3889\": rpc error: code = NotFound desc = could not find container \"6764efdc15398629b854ed53e0a6315e4f608ff8428ff4e05da659eb242c3889\": container with ID starting with 6764efdc15398629b854ed53e0a6315e4f608ff8428ff4e05da659eb242c3889 not found: ID does not exist" Apr 24 16:55:56.183799 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.183771 2573 scope.go:117] "RemoveContainer" containerID="c2229b6d05c458cf1a6005ddd3c2d6e6968226c8c48d9bd7203cd4649ae20fb7" Apr 24 16:55:56.183974 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:55:56.183950 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2229b6d05c458cf1a6005ddd3c2d6e6968226c8c48d9bd7203cd4649ae20fb7\": container with ID starting with c2229b6d05c458cf1a6005ddd3c2d6e6968226c8c48d9bd7203cd4649ae20fb7 not found: ID does not exist" containerID="c2229b6d05c458cf1a6005ddd3c2d6e6968226c8c48d9bd7203cd4649ae20fb7" Apr 24 16:55:56.184035 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.183983 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2229b6d05c458cf1a6005ddd3c2d6e6968226c8c48d9bd7203cd4649ae20fb7"} err="failed to get container status \"c2229b6d05c458cf1a6005ddd3c2d6e6968226c8c48d9bd7203cd4649ae20fb7\": rpc error: code = NotFound desc = could not find container \"c2229b6d05c458cf1a6005ddd3c2d6e6968226c8c48d9bd7203cd4649ae20fb7\": container with ID starting with c2229b6d05c458cf1a6005ddd3c2d6e6968226c8c48d9bd7203cd4649ae20fb7 not found: ID does not exist" Apr 24 16:55:56.221863 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:55:56.221824 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" 
path="/var/lib/kubelet/pods/c7ac697c-8c16-4f87-8a8b-4ea450e0577e/volumes" Apr 24 16:56:04.105614 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:04.105564 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" podUID="99c42237-83d3-4bd1-9809-bc21f0ca6219" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 24 16:56:14.105991 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:14.105951 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" podUID="99c42237-83d3-4bd1-9809-bc21f0ca6219" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 24 16:56:24.106016 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:24.105973 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" podUID="99c42237-83d3-4bd1-9809-bc21f0ca6219" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 24 16:56:34.106049 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:34.106013 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" podUID="99c42237-83d3-4bd1-9809-bc21f0ca6219" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 24 16:56:44.105297 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:44.105252 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" podUID="99c42237-83d3-4bd1-9809-bc21f0ca6219" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 24 16:56:54.106513 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:54.106477 
2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" Apr 24 16:56:55.924392 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:55.924362 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5"] Apr 24 16:56:55.924746 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:55.924654 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" podUID="99c42237-83d3-4bd1-9809-bc21f0ca6219" containerName="kserve-container" containerID="cri-o://b3bb87dd884196200927da7400e789928d3b399e4df78536cc68c615225384dc" gracePeriod=30 Apr 24 16:56:55.924746 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:55.924690 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" podUID="99c42237-83d3-4bd1-9809-bc21f0ca6219" containerName="kube-rbac-proxy" containerID="cri-o://8cd96b45f2814b23a270d2fd608c142641f94af419ea86c9a8f3add4c0926303" gracePeriod=30 Apr 24 16:56:56.037348 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.037303 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b"] Apr 24 16:56:56.037610 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.037597 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="storage-initializer" Apr 24 16:56:56.037662 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.037612 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="storage-initializer" Apr 24 16:56:56.037662 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.037622 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="agent" Apr 24 16:56:56.037662 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.037628 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="agent" Apr 24 16:56:56.037662 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.037637 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8" containerName="kube-rbac-proxy" Apr 24 16:56:56.037662 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.037643 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8" containerName="kube-rbac-proxy" Apr 24 16:56:56.037662 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.037650 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8" containerName="kserve-container" Apr 24 16:56:56.037662 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.037655 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8" containerName="kserve-container" Apr 24 16:56:56.037662 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.037661 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kube-rbac-proxy" Apr 24 16:56:56.037884 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.037667 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kube-rbac-proxy" Apr 24 16:56:56.037884 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.037676 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kserve-container" Apr 24 16:56:56.037884 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.037681 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kserve-container" Apr 24 16:56:56.037884 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.037740 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kserve-container" Apr 24 16:56:56.037884 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.037747 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="kube-rbac-proxy" Apr 24 16:56:56.037884 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.037752 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8" containerName="kserve-container" Apr 24 16:56:56.037884 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.037760 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7ac697c-8c16-4f87-8a8b-4ea450e0577e" containerName="agent" Apr 24 16:56:56.037884 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.037767 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2a1bfbec-83f2-4e90-8428-8e6aea3cf2f8" containerName="kube-rbac-proxy" Apr 24 16:56:56.040758 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.040740 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" Apr 24 16:56:56.046269 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.046252 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-predictor-serving-cert\"" Apr 24 16:56:56.046511 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.046494 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\"" Apr 24 16:56:56.054992 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.054971 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b"] Apr 24 16:56:56.155981 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.155952 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/247f3cb9-3d53-4846-b130-07ee73ce1088-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b\" (UID: \"247f3cb9-3d53-4846-b130-07ee73ce1088\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" Apr 24 16:56:56.156116 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.155994 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsj96\" (UniqueName: \"kubernetes.io/projected/247f3cb9-3d53-4846-b130-07ee73ce1088-kube-api-access-bsj96\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b\" (UID: \"247f3cb9-3d53-4846-b130-07ee73ce1088\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" Apr 24 16:56:56.156116 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.156014 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/247f3cb9-3d53-4846-b130-07ee73ce1088-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b\" (UID: \"247f3cb9-3d53-4846-b130-07ee73ce1088\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" Apr 24 16:56:56.156116 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.156094 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/247f3cb9-3d53-4846-b130-07ee73ce1088-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b\" (UID: \"247f3cb9-3d53-4846-b130-07ee73ce1088\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" Apr 24 16:56:56.257225 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.257192 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsj96\" (UniqueName: \"kubernetes.io/projected/247f3cb9-3d53-4846-b130-07ee73ce1088-kube-api-access-bsj96\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b\" (UID: \"247f3cb9-3d53-4846-b130-07ee73ce1088\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" Apr 24 16:56:56.257405 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.257228 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/247f3cb9-3d53-4846-b130-07ee73ce1088-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b\" (UID: \"247f3cb9-3d53-4846-b130-07ee73ce1088\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" Apr 24 16:56:56.257405 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.257252 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/247f3cb9-3d53-4846-b130-07ee73ce1088-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b\" (UID: \"247f3cb9-3d53-4846-b130-07ee73ce1088\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" Apr 24 16:56:56.257405 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.257295 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/247f3cb9-3d53-4846-b130-07ee73ce1088-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b\" (UID: \"247f3cb9-3d53-4846-b130-07ee73ce1088\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" Apr 24 16:56:56.257716 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.257695 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/247f3cb9-3d53-4846-b130-07ee73ce1088-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b\" (UID: \"247f3cb9-3d53-4846-b130-07ee73ce1088\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" Apr 24 16:56:56.257822 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.257803 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/247f3cb9-3d53-4846-b130-07ee73ce1088-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b\" (UID: \"247f3cb9-3d53-4846-b130-07ee73ce1088\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" Apr 24 16:56:56.259694 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.259677 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/247f3cb9-3d53-4846-b130-07ee73ce1088-proxy-tls\") pod 
\"isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b\" (UID: \"247f3cb9-3d53-4846-b130-07ee73ce1088\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" Apr 24 16:56:56.269290 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.269268 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsj96\" (UniqueName: \"kubernetes.io/projected/247f3cb9-3d53-4846-b130-07ee73ce1088-kube-api-access-bsj96\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b\" (UID: \"247f3cb9-3d53-4846-b130-07ee73ce1088\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" Apr 24 16:56:56.316585 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.316558 2573 generic.go:358] "Generic (PLEG): container finished" podID="99c42237-83d3-4bd1-9809-bc21f0ca6219" containerID="8cd96b45f2814b23a270d2fd608c142641f94af419ea86c9a8f3add4c0926303" exitCode=2 Apr 24 16:56:56.316676 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.316631 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" event={"ID":"99c42237-83d3-4bd1-9809-bc21f0ca6219","Type":"ContainerDied","Data":"8cd96b45f2814b23a270d2fd608c142641f94af419ea86c9a8f3add4c0926303"} Apr 24 16:56:56.350786 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.350761 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" Apr 24 16:56:56.473931 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:56.473898 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b"] Apr 24 16:56:56.479361 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:56:56.477137 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod247f3cb9_3d53_4846_b130_07ee73ce1088.slice/crio-6af119029c3950db4b2994e7f69bd09d55c19c6c0b4d2dc1e49c5ebe36c75ffa WatchSource:0}: Error finding container 6af119029c3950db4b2994e7f69bd09d55c19c6c0b4d2dc1e49c5ebe36c75ffa: Status 404 returned error can't find the container with id 6af119029c3950db4b2994e7f69bd09d55c19c6c0b4d2dc1e49c5ebe36c75ffa Apr 24 16:56:57.320069 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:57.320026 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" event={"ID":"247f3cb9-3d53-4846-b130-07ee73ce1088","Type":"ContainerStarted","Data":"11fdeb0ddc3f523fda0777311af826c41738fe86670589307297ad546b39a6ba"} Apr 24 16:56:57.320069 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:57.320071 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" event={"ID":"247f3cb9-3d53-4846-b130-07ee73ce1088","Type":"ContainerStarted","Data":"6af119029c3950db4b2994e7f69bd09d55c19c6c0b4d2dc1e49c5ebe36c75ffa"} Apr 24 16:56:59.100495 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:56:59.100451 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" podUID="99c42237-83d3-4bd1-9809-bc21f0ca6219" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.25:8643/healthz\": dial tcp 10.134.0.25:8643: connect: connection 
refused" Apr 24 16:57:00.959850 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:00.959826 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" Apr 24 16:57:01.095971 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.095938 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99c42237-83d3-4bd1-9809-bc21f0ca6219-kserve-provision-location\") pod \"99c42237-83d3-4bd1-9809-bc21f0ca6219\" (UID: \"99c42237-83d3-4bd1-9809-bc21f0ca6219\") " Apr 24 16:57:01.096156 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.095992 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99c42237-83d3-4bd1-9809-bc21f0ca6219-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"99c42237-83d3-4bd1-9809-bc21f0ca6219\" (UID: \"99c42237-83d3-4bd1-9809-bc21f0ca6219\") " Apr 24 16:57:01.096156 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.096046 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99c42237-83d3-4bd1-9809-bc21f0ca6219-proxy-tls\") pod \"99c42237-83d3-4bd1-9809-bc21f0ca6219\" (UID: \"99c42237-83d3-4bd1-9809-bc21f0ca6219\") " Apr 24 16:57:01.096156 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.096097 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gznbw\" (UniqueName: \"kubernetes.io/projected/99c42237-83d3-4bd1-9809-bc21f0ca6219-kube-api-access-gznbw\") pod \"99c42237-83d3-4bd1-9809-bc21f0ca6219\" (UID: \"99c42237-83d3-4bd1-9809-bc21f0ca6219\") " Apr 24 16:57:01.096303 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.096274 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/99c42237-83d3-4bd1-9809-bc21f0ca6219-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "99c42237-83d3-4bd1-9809-bc21f0ca6219" (UID: "99c42237-83d3-4bd1-9809-bc21f0ca6219"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:57:01.096432 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.096411 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99c42237-83d3-4bd1-9809-bc21f0ca6219-isvc-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-kube-rbac-proxy-sar-config") pod "99c42237-83d3-4bd1-9809-bc21f0ca6219" (UID: "99c42237-83d3-4bd1-9809-bc21f0ca6219"). InnerVolumeSpecName "isvc-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:57:01.098169 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.098150 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99c42237-83d3-4bd1-9809-bc21f0ca6219-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "99c42237-83d3-4bd1-9809-bc21f0ca6219" (UID: "99c42237-83d3-4bd1-9809-bc21f0ca6219"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:57:01.098294 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.098274 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99c42237-83d3-4bd1-9809-bc21f0ca6219-kube-api-access-gznbw" (OuterVolumeSpecName: "kube-api-access-gznbw") pod "99c42237-83d3-4bd1-9809-bc21f0ca6219" (UID: "99c42237-83d3-4bd1-9809-bc21f0ca6219"). InnerVolumeSpecName "kube-api-access-gznbw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:57:01.196817 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.196768 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gznbw\" (UniqueName: \"kubernetes.io/projected/99c42237-83d3-4bd1-9809-bc21f0ca6219-kube-api-access-gznbw\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:57:01.196817 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.196813 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99c42237-83d3-4bd1-9809-bc21f0ca6219-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:57:01.196817 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.196828 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99c42237-83d3-4bd1-9809-bc21f0ca6219-isvc-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:57:01.197015 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.196843 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99c42237-83d3-4bd1-9809-bc21f0ca6219-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 16:57:01.332907 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.332873 2573 generic.go:358] "Generic (PLEG): container finished" podID="99c42237-83d3-4bd1-9809-bc21f0ca6219" containerID="b3bb87dd884196200927da7400e789928d3b399e4df78536cc68c615225384dc" exitCode=0 Apr 24 16:57:01.333068 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.332968 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" Apr 24 16:57:01.333068 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.332962 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" event={"ID":"99c42237-83d3-4bd1-9809-bc21f0ca6219","Type":"ContainerDied","Data":"b3bb87dd884196200927da7400e789928d3b399e4df78536cc68c615225384dc"} Apr 24 16:57:01.333179 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.333079 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5" event={"ID":"99c42237-83d3-4bd1-9809-bc21f0ca6219","Type":"ContainerDied","Data":"37eaf0adcb6458d805860252aad99ebc89b102dc6c4925e7035a55fa6091a313"} Apr 24 16:57:01.333179 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.333107 2573 scope.go:117] "RemoveContainer" containerID="8cd96b45f2814b23a270d2fd608c142641f94af419ea86c9a8f3add4c0926303" Apr 24 16:57:01.334358 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.334336 2573 generic.go:358] "Generic (PLEG): container finished" podID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerID="11fdeb0ddc3f523fda0777311af826c41738fe86670589307297ad546b39a6ba" exitCode=0 Apr 24 16:57:01.334475 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.334433 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" event={"ID":"247f3cb9-3d53-4846-b130-07ee73ce1088","Type":"ContainerDied","Data":"11fdeb0ddc3f523fda0777311af826c41738fe86670589307297ad546b39a6ba"} Apr 24 16:57:01.341343 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.341207 2573 scope.go:117] "RemoveContainer" containerID="b3bb87dd884196200927da7400e789928d3b399e4df78536cc68c615225384dc" Apr 24 16:57:01.349192 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.349177 2573 scope.go:117] "RemoveContainer" 
containerID="f3e204114eb1a64d92a14882e1d77a09053d0ddb7d484bd5dab61fa69bc83bd8" Apr 24 16:57:01.356029 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.356009 2573 scope.go:117] "RemoveContainer" containerID="8cd96b45f2814b23a270d2fd608c142641f94af419ea86c9a8f3add4c0926303" Apr 24 16:57:01.356274 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:57:01.356257 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cd96b45f2814b23a270d2fd608c142641f94af419ea86c9a8f3add4c0926303\": container with ID starting with 8cd96b45f2814b23a270d2fd608c142641f94af419ea86c9a8f3add4c0926303 not found: ID does not exist" containerID="8cd96b45f2814b23a270d2fd608c142641f94af419ea86c9a8f3add4c0926303" Apr 24 16:57:01.356459 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.356279 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cd96b45f2814b23a270d2fd608c142641f94af419ea86c9a8f3add4c0926303"} err="failed to get container status \"8cd96b45f2814b23a270d2fd608c142641f94af419ea86c9a8f3add4c0926303\": rpc error: code = NotFound desc = could not find container \"8cd96b45f2814b23a270d2fd608c142641f94af419ea86c9a8f3add4c0926303\": container with ID starting with 8cd96b45f2814b23a270d2fd608c142641f94af419ea86c9a8f3add4c0926303 not found: ID does not exist" Apr 24 16:57:01.356459 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.356296 2573 scope.go:117] "RemoveContainer" containerID="b3bb87dd884196200927da7400e789928d3b399e4df78536cc68c615225384dc" Apr 24 16:57:01.356572 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:57:01.356551 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3bb87dd884196200927da7400e789928d3b399e4df78536cc68c615225384dc\": container with ID starting with b3bb87dd884196200927da7400e789928d3b399e4df78536cc68c615225384dc not found: ID does not exist" 
containerID="b3bb87dd884196200927da7400e789928d3b399e4df78536cc68c615225384dc" Apr 24 16:57:01.356610 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.356579 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3bb87dd884196200927da7400e789928d3b399e4df78536cc68c615225384dc"} err="failed to get container status \"b3bb87dd884196200927da7400e789928d3b399e4df78536cc68c615225384dc\": rpc error: code = NotFound desc = could not find container \"b3bb87dd884196200927da7400e789928d3b399e4df78536cc68c615225384dc\": container with ID starting with b3bb87dd884196200927da7400e789928d3b399e4df78536cc68c615225384dc not found: ID does not exist" Apr 24 16:57:01.356610 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.356596 2573 scope.go:117] "RemoveContainer" containerID="f3e204114eb1a64d92a14882e1d77a09053d0ddb7d484bd5dab61fa69bc83bd8" Apr 24 16:57:01.356868 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:57:01.356849 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3e204114eb1a64d92a14882e1d77a09053d0ddb7d484bd5dab61fa69bc83bd8\": container with ID starting with f3e204114eb1a64d92a14882e1d77a09053d0ddb7d484bd5dab61fa69bc83bd8 not found: ID does not exist" containerID="f3e204114eb1a64d92a14882e1d77a09053d0ddb7d484bd5dab61fa69bc83bd8" Apr 24 16:57:01.356925 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.356879 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3e204114eb1a64d92a14882e1d77a09053d0ddb7d484bd5dab61fa69bc83bd8"} err="failed to get container status \"f3e204114eb1a64d92a14882e1d77a09053d0ddb7d484bd5dab61fa69bc83bd8\": rpc error: code = NotFound desc = could not find container \"f3e204114eb1a64d92a14882e1d77a09053d0ddb7d484bd5dab61fa69bc83bd8\": container with ID starting with f3e204114eb1a64d92a14882e1d77a09053d0ddb7d484bd5dab61fa69bc83bd8 not found: ID does not exist" Apr 24 
16:57:01.371642 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.371622 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5"]
Apr 24 16:57:01.374569 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:01.374549 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-566h5"]
Apr 24 16:57:02.216288 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:02.216252 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99c42237-83d3-4bd1-9809-bc21f0ca6219" path="/var/lib/kubelet/pods/99c42237-83d3-4bd1-9809-bc21f0ca6219/volumes"
Apr 24 16:57:02.339302 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:02.339266 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" event={"ID":"247f3cb9-3d53-4846-b130-07ee73ce1088","Type":"ContainerStarted","Data":"422ab7c4a32d7e226c89e572e312bbb7986cd76645a1e61fa5f1f8844538266e"}
Apr 24 16:57:02.339302 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:02.339325 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" event={"ID":"247f3cb9-3d53-4846-b130-07ee73ce1088","Type":"ContainerStarted","Data":"602a276bf2db8d4859c7f800f2efb73af8da4c299a99efa8100748cf1c57d042"}
Apr 24 16:57:02.339554 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:02.339536 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b"
Apr 24 16:57:02.360885 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:02.360832 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" podStartSLOduration=6.360818977 podStartE2EDuration="6.360818977s" podCreationTimestamp="2026-04-24 16:56:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:57:02.359706233 +0000 UTC m=+1080.624421628" watchObservedRunningTime="2026-04-24 16:57:02.360818977 +0000 UTC m=+1080.625534373"
Apr 24 16:57:03.342547 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:03.342514 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b"
Apr 24 16:57:03.343656 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:03.343626 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" podUID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 24 16:57:04.345324 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:04.345277 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" podUID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 24 16:57:09.349488 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:09.349458 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b"
Apr 24 16:57:09.350007 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:09.349979 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" podUID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 24 16:57:19.350161 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:19.350120 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" podUID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 24 16:57:29.350574 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:29.350532 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" podUID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 24 16:57:39.350689 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:39.350647 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" podUID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 24 16:57:49.350555 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:49.350511 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" podUID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 24 16:57:59.350626 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:57:59.350584 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" podUID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 24 16:58:09.350116 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:09.350075 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" podUID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 24 16:58:19.350476 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:19.350444 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b"
Apr 24 16:58:26.699249 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:26.699176 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b"]
Apr 24 16:58:26.699679 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:26.699524 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" podUID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerName="kserve-container" containerID="cri-o://602a276bf2db8d4859c7f800f2efb73af8da4c299a99efa8100748cf1c57d042" gracePeriod=30
Apr 24 16:58:26.699679 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:26.699560 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" podUID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerName="kube-rbac-proxy" containerID="cri-o://422ab7c4a32d7e226c89e572e312bbb7986cd76645a1e61fa5f1f8844538266e" gracePeriod=30
Apr 24 16:58:26.844529 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:26.844495 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"]
Apr 24 16:58:26.844835 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:26.844818 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99c42237-83d3-4bd1-9809-bc21f0ca6219" containerName="kube-rbac-proxy"
Apr 24 16:58:26.844910 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:26.844838 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c42237-83d3-4bd1-9809-bc21f0ca6219" containerName="kube-rbac-proxy"
Apr 24 16:58:26.844910 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:26.844858 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99c42237-83d3-4bd1-9809-bc21f0ca6219" containerName="storage-initializer"
Apr 24 16:58:26.844910 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:26.844867 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c42237-83d3-4bd1-9809-bc21f0ca6219" containerName="storage-initializer"
Apr 24 16:58:26.844910 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:26.844889 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99c42237-83d3-4bd1-9809-bc21f0ca6219" containerName="kserve-container"
Apr 24 16:58:26.844910 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:26.844897 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c42237-83d3-4bd1-9809-bc21f0ca6219" containerName="kserve-container"
Apr 24 16:58:26.845162 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:26.844979 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="99c42237-83d3-4bd1-9809-bc21f0ca6219" containerName="kserve-container"
Apr 24 16:58:26.845162 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:26.844993 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="99c42237-83d3-4bd1-9809-bc21f0ca6219" containerName="kube-rbac-proxy"
Apr 24 16:58:26.848019 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:26.847998 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"
Apr 24 16:58:26.850576 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:26.850558 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-predictor-serving-cert\""
Apr 24 16:58:26.850792 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:26.850777 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\""
Apr 24 16:58:26.859010 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:26.858989 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"]
Apr 24 16:58:26.993440 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:26.993411 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p92f2\" (UniqueName: \"kubernetes.io/projected/3b319f88-1f50-429e-ab49-0e2342ca2e35-kube-api-access-p92f2\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2\" (UID: \"3b319f88-1f50-429e-ab49-0e2342ca2e35\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"
Apr 24 16:58:26.993584 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:26.993456 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3b319f88-1f50-429e-ab49-0e2342ca2e35-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2\" (UID: \"3b319f88-1f50-429e-ab49-0e2342ca2e35\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"
Apr 24 16:58:26.993584 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:26.993526 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3b319f88-1f50-429e-ab49-0e2342ca2e35-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2\" (UID: \"3b319f88-1f50-429e-ab49-0e2342ca2e35\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"
Apr 24 16:58:26.993654 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:26.993585 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b319f88-1f50-429e-ab49-0e2342ca2e35-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2\" (UID: \"3b319f88-1f50-429e-ab49-0e2342ca2e35\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"
Apr 24 16:58:27.094974 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:27.094939 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p92f2\" (UniqueName: \"kubernetes.io/projected/3b319f88-1f50-429e-ab49-0e2342ca2e35-kube-api-access-p92f2\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2\" (UID: \"3b319f88-1f50-429e-ab49-0e2342ca2e35\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"
Apr 24 16:58:27.095124 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:27.094984 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3b319f88-1f50-429e-ab49-0e2342ca2e35-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2\" (UID: \"3b319f88-1f50-429e-ab49-0e2342ca2e35\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"
Apr 24 16:58:27.095124 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:27.095014 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3b319f88-1f50-429e-ab49-0e2342ca2e35-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2\" (UID: \"3b319f88-1f50-429e-ab49-0e2342ca2e35\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"
Apr 24 16:58:27.095124 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:27.095040 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b319f88-1f50-429e-ab49-0e2342ca2e35-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2\" (UID: \"3b319f88-1f50-429e-ab49-0e2342ca2e35\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"
Apr 24 16:58:27.095448 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:27.095431 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b319f88-1f50-429e-ab49-0e2342ca2e35-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2\" (UID: \"3b319f88-1f50-429e-ab49-0e2342ca2e35\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"
Apr 24 16:58:27.095784 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:27.095756 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3b319f88-1f50-429e-ab49-0e2342ca2e35-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2\" (UID: \"3b319f88-1f50-429e-ab49-0e2342ca2e35\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"
Apr 24 16:58:27.097612 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:27.097595 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3b319f88-1f50-429e-ab49-0e2342ca2e35-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2\" (UID: \"3b319f88-1f50-429e-ab49-0e2342ca2e35\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"
Apr 24 16:58:27.103459 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:27.103440 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p92f2\" (UniqueName: \"kubernetes.io/projected/3b319f88-1f50-429e-ab49-0e2342ca2e35-kube-api-access-p92f2\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2\" (UID: \"3b319f88-1f50-429e-ab49-0e2342ca2e35\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"
Apr 24 16:58:27.157363 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:27.157342 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"
Apr 24 16:58:27.283283 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:27.283216 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"]
Apr 24 16:58:27.286291 ip-10-0-142-182 kubenswrapper[2573]: W0424 16:58:27.286265 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b319f88_1f50_429e_ab49_0e2342ca2e35.slice/crio-b6e097e6ab84c50cf30128133c889aa1568b48c90c0e4e1b1f99d66ad74b3913 WatchSource:0}: Error finding container b6e097e6ab84c50cf30128133c889aa1568b48c90c0e4e1b1f99d66ad74b3913: Status 404 returned error can't find the container with id b6e097e6ab84c50cf30128133c889aa1568b48c90c0e4e1b1f99d66ad74b3913
Apr 24 16:58:27.580556 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:27.580474 2573 generic.go:358] "Generic (PLEG): container finished" podID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerID="422ab7c4a32d7e226c89e572e312bbb7986cd76645a1e61fa5f1f8844538266e" exitCode=2
Apr 24 16:58:27.580556 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:27.580542 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" event={"ID":"247f3cb9-3d53-4846-b130-07ee73ce1088","Type":"ContainerDied","Data":"422ab7c4a32d7e226c89e572e312bbb7986cd76645a1e61fa5f1f8844538266e"}
Apr 24 16:58:27.581878 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:27.581843 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2" event={"ID":"3b319f88-1f50-429e-ab49-0e2342ca2e35","Type":"ContainerStarted","Data":"db690386fe568aa05de42c9311df334ee563cba37f1f346467166c8754431617"}
Apr 24 16:58:27.581878 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:27.581869 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2" event={"ID":"3b319f88-1f50-429e-ab49-0e2342ca2e35","Type":"ContainerStarted","Data":"b6e097e6ab84c50cf30128133c889aa1568b48c90c0e4e1b1f99d66ad74b3913"}
Apr 24 16:58:29.345951 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:29.345899 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" podUID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.26:8643/healthz\": dial tcp 10.134.0.26:8643: connect: connection refused"
Apr 24 16:58:29.350090 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:29.350062 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" podUID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 24 16:58:31.593970 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:31.593931 2573 generic.go:358] "Generic (PLEG): container finished" podID="3b319f88-1f50-429e-ab49-0e2342ca2e35" containerID="db690386fe568aa05de42c9311df334ee563cba37f1f346467166c8754431617" exitCode=0
Apr 24 16:58:31.594366 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:31.594010 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2" event={"ID":"3b319f88-1f50-429e-ab49-0e2342ca2e35","Type":"ContainerDied","Data":"db690386fe568aa05de42c9311df334ee563cba37f1f346467166c8754431617"}
Apr 24 16:58:31.937564 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:31.937538 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b"
Apr 24 16:58:32.033573 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.033533 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/247f3cb9-3d53-4846-b130-07ee73ce1088-kserve-provision-location\") pod \"247f3cb9-3d53-4846-b130-07ee73ce1088\" (UID: \"247f3cb9-3d53-4846-b130-07ee73ce1088\") "
Apr 24 16:58:32.033849 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.033625 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/247f3cb9-3d53-4846-b130-07ee73ce1088-proxy-tls\") pod \"247f3cb9-3d53-4846-b130-07ee73ce1088\" (UID: \"247f3cb9-3d53-4846-b130-07ee73ce1088\") "
Apr 24 16:58:32.033849 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.033664 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/247f3cb9-3d53-4846-b130-07ee73ce1088-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"247f3cb9-3d53-4846-b130-07ee73ce1088\" (UID: \"247f3cb9-3d53-4846-b130-07ee73ce1088\") "
Apr 24 16:58:32.033849 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.033742 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsj96\" (UniqueName: \"kubernetes.io/projected/247f3cb9-3d53-4846-b130-07ee73ce1088-kube-api-access-bsj96\") pod \"247f3cb9-3d53-4846-b130-07ee73ce1088\" (UID: \"247f3cb9-3d53-4846-b130-07ee73ce1088\") "
Apr 24 16:58:32.034007 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.033906 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/247f3cb9-3d53-4846-b130-07ee73ce1088-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "247f3cb9-3d53-4846-b130-07ee73ce1088" (UID: "247f3cb9-3d53-4846-b130-07ee73ce1088"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 16:58:32.034264 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.034234 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/247f3cb9-3d53-4846-b130-07ee73ce1088-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config") pod "247f3cb9-3d53-4846-b130-07ee73ce1088" (UID: "247f3cb9-3d53-4846-b130-07ee73ce1088"). InnerVolumeSpecName "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:58:32.036467 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.036363 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/247f3cb9-3d53-4846-b130-07ee73ce1088-kube-api-access-bsj96" (OuterVolumeSpecName: "kube-api-access-bsj96") pod "247f3cb9-3d53-4846-b130-07ee73ce1088" (UID: "247f3cb9-3d53-4846-b130-07ee73ce1088"). InnerVolumeSpecName "kube-api-access-bsj96". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 16:58:32.037211 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.037183 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/247f3cb9-3d53-4846-b130-07ee73ce1088-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "247f3cb9-3d53-4846-b130-07ee73ce1088" (UID: "247f3cb9-3d53-4846-b130-07ee73ce1088"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:58:32.135381 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.135232 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/247f3cb9-3d53-4846-b130-07ee73ce1088-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 16:58:32.135381 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.135272 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/247f3cb9-3d53-4846-b130-07ee73ce1088-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 16:58:32.135381 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.135289 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bsj96\" (UniqueName: \"kubernetes.io/projected/247f3cb9-3d53-4846-b130-07ee73ce1088-kube-api-access-bsj96\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 16:58:32.135669 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.135511 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/247f3cb9-3d53-4846-b130-07ee73ce1088-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 16:58:32.601169 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.600756 2573 generic.go:358] "Generic (PLEG): container finished" podID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerID="602a276bf2db8d4859c7f800f2efb73af8da4c299a99efa8100748cf1c57d042" exitCode=0
Apr 24 16:58:32.601169 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.600858 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" event={"ID":"247f3cb9-3d53-4846-b130-07ee73ce1088","Type":"ContainerDied","Data":"602a276bf2db8d4859c7f800f2efb73af8da4c299a99efa8100748cf1c57d042"}
Apr 24 16:58:32.601169 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.600903 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b" event={"ID":"247f3cb9-3d53-4846-b130-07ee73ce1088","Type":"ContainerDied","Data":"6af119029c3950db4b2994e7f69bd09d55c19c6c0b4d2dc1e49c5ebe36c75ffa"}
Apr 24 16:58:32.601169 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.600923 2573 scope.go:117] "RemoveContainer" containerID="422ab7c4a32d7e226c89e572e312bbb7986cd76645a1e61fa5f1f8844538266e"
Apr 24 16:58:32.601169 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.601102 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b"
Apr 24 16:58:32.615856 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.615830 2573 scope.go:117] "RemoveContainer" containerID="602a276bf2db8d4859c7f800f2efb73af8da4c299a99efa8100748cf1c57d042"
Apr 24 16:58:32.627237 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.627199 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b"]
Apr 24 16:58:32.630355 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.630327 2573 scope.go:117] "RemoveContainer" containerID="11fdeb0ddc3f523fda0777311af826c41738fe86670589307297ad546b39a6ba"
Apr 24 16:58:32.632782 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.632701 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-bgq6b"]
Apr 24 16:58:32.643578 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.643556 2573 scope.go:117] "RemoveContainer" containerID="422ab7c4a32d7e226c89e572e312bbb7986cd76645a1e61fa5f1f8844538266e"
Apr 24 16:58:32.644348 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:58:32.644227 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"422ab7c4a32d7e226c89e572e312bbb7986cd76645a1e61fa5f1f8844538266e\": container with ID starting with 422ab7c4a32d7e226c89e572e312bbb7986cd76645a1e61fa5f1f8844538266e not found: ID does not exist" containerID="422ab7c4a32d7e226c89e572e312bbb7986cd76645a1e61fa5f1f8844538266e"
Apr 24 16:58:32.644440 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.644338 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"422ab7c4a32d7e226c89e572e312bbb7986cd76645a1e61fa5f1f8844538266e"} err="failed to get container status \"422ab7c4a32d7e226c89e572e312bbb7986cd76645a1e61fa5f1f8844538266e\": rpc error: code = NotFound desc = could not find container \"422ab7c4a32d7e226c89e572e312bbb7986cd76645a1e61fa5f1f8844538266e\": container with ID starting with 422ab7c4a32d7e226c89e572e312bbb7986cd76645a1e61fa5f1f8844538266e not found: ID does not exist"
Apr 24 16:58:32.644440 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.644364 2573 scope.go:117] "RemoveContainer" containerID="602a276bf2db8d4859c7f800f2efb73af8da4c299a99efa8100748cf1c57d042"
Apr 24 16:58:32.644822 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:58:32.644679 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"602a276bf2db8d4859c7f800f2efb73af8da4c299a99efa8100748cf1c57d042\": container with ID starting with 602a276bf2db8d4859c7f800f2efb73af8da4c299a99efa8100748cf1c57d042 not found: ID does not exist" containerID="602a276bf2db8d4859c7f800f2efb73af8da4c299a99efa8100748cf1c57d042"
Apr 24 16:58:32.644822 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.644718 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"602a276bf2db8d4859c7f800f2efb73af8da4c299a99efa8100748cf1c57d042"} err="failed to get container status \"602a276bf2db8d4859c7f800f2efb73af8da4c299a99efa8100748cf1c57d042\": rpc error: code = NotFound desc = could not find container \"602a276bf2db8d4859c7f800f2efb73af8da4c299a99efa8100748cf1c57d042\": container with ID starting with 602a276bf2db8d4859c7f800f2efb73af8da4c299a99efa8100748cf1c57d042 not found: ID does not exist"
Apr 24 16:58:32.644822 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.644743 2573 scope.go:117] "RemoveContainer" containerID="11fdeb0ddc3f523fda0777311af826c41738fe86670589307297ad546b39a6ba"
Apr 24 16:58:32.645220 ip-10-0-142-182 kubenswrapper[2573]: E0424 16:58:32.645152 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11fdeb0ddc3f523fda0777311af826c41738fe86670589307297ad546b39a6ba\": container with ID starting with 11fdeb0ddc3f523fda0777311af826c41738fe86670589307297ad546b39a6ba not found: ID does not exist" containerID="11fdeb0ddc3f523fda0777311af826c41738fe86670589307297ad546b39a6ba"
Apr 24 16:58:32.645220 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:32.645188 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11fdeb0ddc3f523fda0777311af826c41738fe86670589307297ad546b39a6ba"} err="failed to get container status \"11fdeb0ddc3f523fda0777311af826c41738fe86670589307297ad546b39a6ba\": rpc error: code = NotFound desc = could not find container \"11fdeb0ddc3f523fda0777311af826c41738fe86670589307297ad546b39a6ba\": container with ID starting with 11fdeb0ddc3f523fda0777311af826c41738fe86670589307297ad546b39a6ba not found: ID does not exist"
Apr 24 16:58:34.219191 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:58:34.218757 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="247f3cb9-3d53-4846-b130-07ee73ce1088" path="/var/lib/kubelet/pods/247f3cb9-3d53-4846-b130-07ee73ce1088/volumes"
Apr 24 16:59:02.172640 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:59:02.172611 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log"
Apr 24 16:59:02.173203 ip-10-0-142-182 kubenswrapper[2573]: I0424 16:59:02.172685 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log"
Apr 24 17:00:40.646176 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:00:40.646155 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 17:00:41.010987 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:00:41.010952 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2" event={"ID":"3b319f88-1f50-429e-ab49-0e2342ca2e35","Type":"ContainerStarted","Data":"bc9d305718ff5e488461da19b5ad980286f51416495d745014df1ad39d723ccb"}
Apr 24 17:00:41.010987 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:00:41.010991 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2" event={"ID":"3b319f88-1f50-429e-ab49-0e2342ca2e35","Type":"ContainerStarted","Data":"29e6b4b48e8a376a0e6d365b3a3f99de2654efba7095bd2051ed857dc74c8391"}
Apr 24 17:00:41.011431 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:00:41.011108 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"
Apr 24 17:00:41.011431 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:00:41.011170 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"
Apr 24 17:00:41.042750 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:00:41.042705 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2" podStartSLOduration=6.117991153 podStartE2EDuration="2m15.042689083s" podCreationTimestamp="2026-04-24 16:58:26 +0000 UTC" firstStartedPulling="2026-04-24 16:58:31.595080426 +0000 UTC m=+1169.859795801" lastFinishedPulling="2026-04-24 17:00:40.519778352 +0000 UTC m=+1298.784493731" observedRunningTime="2026-04-24 17:00:41.040969546 +0000 UTC m=+1299.305684942" watchObservedRunningTime="2026-04-24 17:00:41.042689083 +0000 UTC m=+1299.307404483"
Apr 24 17:00:47.020192 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:00:47.020160 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"
Apr 24 17:01:17.023741 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:17.023706 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"
Apr 24 17:01:26.991450 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:26.991370 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"]
Apr 24 17:01:26.991808 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:26.991741 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2" podUID="3b319f88-1f50-429e-ab49-0e2342ca2e35" containerName="kserve-container" containerID="cri-o://29e6b4b48e8a376a0e6d365b3a3f99de2654efba7095bd2051ed857dc74c8391" gracePeriod=30
Apr 24 17:01:26.991808 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:26.991780 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2" podUID="3b319f88-1f50-429e-ab49-0e2342ca2e35" containerName="kube-rbac-proxy" containerID="cri-o://bc9d305718ff5e488461da19b5ad980286f51416495d745014df1ad39d723ccb" gracePeriod=30
Apr 24 17:01:27.015233 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.015196 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2" podUID="3b319f88-1f50-429e-ab49-0e2342ca2e35" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.27:8643/healthz\": dial tcp 10.134.0.27:8643: connect: connection refused"
Apr 24 17:01:27.090439 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.090403 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk"]
Apr 24 17:01:27.090728 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.090712 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerName="storage-initializer"
Apr 24 17:01:27.090728 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.090727 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerName="storage-initializer"
Apr 24 17:01:27.090875 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.090747 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerName="kserve-container"
Apr 24 17:01:27.090875 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.090753 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerName="kserve-container"
Apr 24 17:01:27.090875 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.090762 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerName="kube-rbac-proxy"
Apr 24 17:01:27.090875 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.090770 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerName="kube-rbac-proxy"
Apr 24 17:01:27.090875 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.090819 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerName="kube-rbac-proxy"
Apr 24 17:01:27.090875 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.090832 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="247f3cb9-3d53-4846-b130-07ee73ce1088" containerName="kserve-container"
Apr 24 17:01:27.095902 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.095878 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk"
Apr 24 17:01:27.097944 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.097917 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-predictor-serving-cert\""
Apr 24 17:01:27.098068 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.097967 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\""
Apr 24 17:01:27.105529 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.105504 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk"]
Apr 24 17:01:27.137728 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.137689 2573 generic.go:358] "Generic (PLEG): container finished" podID="3b319f88-1f50-429e-ab49-0e2342ca2e35" containerID="bc9d305718ff5e488461da19b5ad980286f51416495d745014df1ad39d723ccb" exitCode=2
Apr 24 17:01:27.137893 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.137739 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2" event={"ID":"3b319f88-1f50-429e-ab49-0e2342ca2e35","Type":"ContainerDied","Data":"bc9d305718ff5e488461da19b5ad980286f51416495d745014df1ad39d723ccb"}
Apr 24 17:01:27.247264 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.247176 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld26p\" (UniqueName: \"kubernetes.io/projected/b3745a76-0ca5-4b02-9515-1503bb14983e-kube-api-access-ld26p\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk\" (UID: \"b3745a76-0ca5-4b02-9515-1503bb14983e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk"
Apr 24 17:01:27.247264 ip-10-0-142-182 kubenswrapper[2573]: I0424
17:01:27.247242 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b3745a76-0ca5-4b02-9515-1503bb14983e-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk\" (UID: \"b3745a76-0ca5-4b02-9515-1503bb14983e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" Apr 24 17:01:27.247480 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.247297 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3745a76-0ca5-4b02-9515-1503bb14983e-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk\" (UID: \"b3745a76-0ca5-4b02-9515-1503bb14983e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" Apr 24 17:01:27.247480 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.247358 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b3745a76-0ca5-4b02-9515-1503bb14983e-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk\" (UID: \"b3745a76-0ca5-4b02-9515-1503bb14983e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" Apr 24 17:01:27.347881 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.347841 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b3745a76-0ca5-4b02-9515-1503bb14983e-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk\" (UID: \"b3745a76-0ca5-4b02-9515-1503bb14983e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" Apr 24 17:01:27.348062 ip-10-0-142-182 kubenswrapper[2573]: I0424 
17:01:27.347911 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3745a76-0ca5-4b02-9515-1503bb14983e-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk\" (UID: \"b3745a76-0ca5-4b02-9515-1503bb14983e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" Apr 24 17:01:27.348062 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.347940 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b3745a76-0ca5-4b02-9515-1503bb14983e-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk\" (UID: \"b3745a76-0ca5-4b02-9515-1503bb14983e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" Apr 24 17:01:27.348184 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.348092 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ld26p\" (UniqueName: \"kubernetes.io/projected/b3745a76-0ca5-4b02-9515-1503bb14983e-kube-api-access-ld26p\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk\" (UID: \"b3745a76-0ca5-4b02-9515-1503bb14983e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" Apr 24 17:01:27.348813 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.348237 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b3745a76-0ca5-4b02-9515-1503bb14983e-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk\" (UID: \"b3745a76-0ca5-4b02-9515-1503bb14983e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" Apr 24 17:01:27.349121 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.349095 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b3745a76-0ca5-4b02-9515-1503bb14983e-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk\" (UID: \"b3745a76-0ca5-4b02-9515-1503bb14983e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" Apr 24 17:01:27.350706 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.350686 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3745a76-0ca5-4b02-9515-1503bb14983e-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk\" (UID: \"b3745a76-0ca5-4b02-9515-1503bb14983e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" Apr 24 17:01:27.355542 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.355522 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld26p\" (UniqueName: \"kubernetes.io/projected/b3745a76-0ca5-4b02-9515-1503bb14983e-kube-api-access-ld26p\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk\" (UID: \"b3745a76-0ca5-4b02-9515-1503bb14983e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" Apr 24 17:01:27.407947 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.407915 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" Apr 24 17:01:27.530363 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:27.530331 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk"] Apr 24 17:01:27.532960 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:01:27.532928 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3745a76_0ca5_4b02_9515_1503bb14983e.slice/crio-ae18e64420dc0822ea2d4d0f17ddb86e23c857afa8c0c33780510c8fb76bfc7c WatchSource:0}: Error finding container ae18e64420dc0822ea2d4d0f17ddb86e23c857afa8c0c33780510c8fb76bfc7c: Status 404 returned error can't find the container with id ae18e64420dc0822ea2d4d0f17ddb86e23c857afa8c0c33780510c8fb76bfc7c Apr 24 17:01:28.137850 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.137827 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2" Apr 24 17:01:28.141968 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.141939 2573 generic.go:358] "Generic (PLEG): container finished" podID="3b319f88-1f50-429e-ab49-0e2342ca2e35" containerID="29e6b4b48e8a376a0e6d365b3a3f99de2654efba7095bd2051ed857dc74c8391" exitCode=0 Apr 24 17:01:28.142104 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.142013 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2" Apr 24 17:01:28.142104 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.142018 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2" event={"ID":"3b319f88-1f50-429e-ab49-0e2342ca2e35","Type":"ContainerDied","Data":"29e6b4b48e8a376a0e6d365b3a3f99de2654efba7095bd2051ed857dc74c8391"} Apr 24 17:01:28.142201 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.142111 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2" event={"ID":"3b319f88-1f50-429e-ab49-0e2342ca2e35","Type":"ContainerDied","Data":"b6e097e6ab84c50cf30128133c889aa1568b48c90c0e4e1b1f99d66ad74b3913"} Apr 24 17:01:28.142201 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.142142 2573 scope.go:117] "RemoveContainer" containerID="bc9d305718ff5e488461da19b5ad980286f51416495d745014df1ad39d723ccb" Apr 24 17:01:28.143476 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.143451 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" event={"ID":"b3745a76-0ca5-4b02-9515-1503bb14983e","Type":"ContainerStarted","Data":"9ad7d34774276dc842ba409d820c28eb24739ba81594185ed7b6b28af43080e5"} Apr 24 17:01:28.143476 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.143484 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" event={"ID":"b3745a76-0ca5-4b02-9515-1503bb14983e","Type":"ContainerStarted","Data":"ae18e64420dc0822ea2d4d0f17ddb86e23c857afa8c0c33780510c8fb76bfc7c"} Apr 24 17:01:28.150227 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.150198 2573 scope.go:117] "RemoveContainer" containerID="29e6b4b48e8a376a0e6d365b3a3f99de2654efba7095bd2051ed857dc74c8391" Apr 24 17:01:28.158765 ip-10-0-142-182 
kubenswrapper[2573]: I0424 17:01:28.158728 2573 scope.go:117] "RemoveContainer" containerID="db690386fe568aa05de42c9311df334ee563cba37f1f346467166c8754431617" Apr 24 17:01:28.166455 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.166432 2573 scope.go:117] "RemoveContainer" containerID="bc9d305718ff5e488461da19b5ad980286f51416495d745014df1ad39d723ccb" Apr 24 17:01:28.166712 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:01:28.166682 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc9d305718ff5e488461da19b5ad980286f51416495d745014df1ad39d723ccb\": container with ID starting with bc9d305718ff5e488461da19b5ad980286f51416495d745014df1ad39d723ccb not found: ID does not exist" containerID="bc9d305718ff5e488461da19b5ad980286f51416495d745014df1ad39d723ccb" Apr 24 17:01:28.166792 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.166719 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc9d305718ff5e488461da19b5ad980286f51416495d745014df1ad39d723ccb"} err="failed to get container status \"bc9d305718ff5e488461da19b5ad980286f51416495d745014df1ad39d723ccb\": rpc error: code = NotFound desc = could not find container \"bc9d305718ff5e488461da19b5ad980286f51416495d745014df1ad39d723ccb\": container with ID starting with bc9d305718ff5e488461da19b5ad980286f51416495d745014df1ad39d723ccb not found: ID does not exist" Apr 24 17:01:28.166792 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.166736 2573 scope.go:117] "RemoveContainer" containerID="29e6b4b48e8a376a0e6d365b3a3f99de2654efba7095bd2051ed857dc74c8391" Apr 24 17:01:28.167039 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:01:28.167004 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29e6b4b48e8a376a0e6d365b3a3f99de2654efba7095bd2051ed857dc74c8391\": container with ID starting with 
29e6b4b48e8a376a0e6d365b3a3f99de2654efba7095bd2051ed857dc74c8391 not found: ID does not exist" containerID="29e6b4b48e8a376a0e6d365b3a3f99de2654efba7095bd2051ed857dc74c8391" Apr 24 17:01:28.167085 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.167049 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29e6b4b48e8a376a0e6d365b3a3f99de2654efba7095bd2051ed857dc74c8391"} err="failed to get container status \"29e6b4b48e8a376a0e6d365b3a3f99de2654efba7095bd2051ed857dc74c8391\": rpc error: code = NotFound desc = could not find container \"29e6b4b48e8a376a0e6d365b3a3f99de2654efba7095bd2051ed857dc74c8391\": container with ID starting with 29e6b4b48e8a376a0e6d365b3a3f99de2654efba7095bd2051ed857dc74c8391 not found: ID does not exist" Apr 24 17:01:28.167085 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.167072 2573 scope.go:117] "RemoveContainer" containerID="db690386fe568aa05de42c9311df334ee563cba37f1f346467166c8754431617" Apr 24 17:01:28.167338 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:01:28.167303 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db690386fe568aa05de42c9311df334ee563cba37f1f346467166c8754431617\": container with ID starting with db690386fe568aa05de42c9311df334ee563cba37f1f346467166c8754431617 not found: ID does not exist" containerID="db690386fe568aa05de42c9311df334ee563cba37f1f346467166c8754431617" Apr 24 17:01:28.167433 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.167343 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db690386fe568aa05de42c9311df334ee563cba37f1f346467166c8754431617"} err="failed to get container status \"db690386fe568aa05de42c9311df334ee563cba37f1f346467166c8754431617\": rpc error: code = NotFound desc = could not find container \"db690386fe568aa05de42c9311df334ee563cba37f1f346467166c8754431617\": container with ID starting with 
db690386fe568aa05de42c9311df334ee563cba37f1f346467166c8754431617 not found: ID does not exist" Apr 24 17:01:28.254976 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.254949 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p92f2\" (UniqueName: \"kubernetes.io/projected/3b319f88-1f50-429e-ab49-0e2342ca2e35-kube-api-access-p92f2\") pod \"3b319f88-1f50-429e-ab49-0e2342ca2e35\" (UID: \"3b319f88-1f50-429e-ab49-0e2342ca2e35\") " Apr 24 17:01:28.255141 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.254981 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3b319f88-1f50-429e-ab49-0e2342ca2e35-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"3b319f88-1f50-429e-ab49-0e2342ca2e35\" (UID: \"3b319f88-1f50-429e-ab49-0e2342ca2e35\") " Apr 24 17:01:28.255141 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.255006 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3b319f88-1f50-429e-ab49-0e2342ca2e35-proxy-tls\") pod \"3b319f88-1f50-429e-ab49-0e2342ca2e35\" (UID: \"3b319f88-1f50-429e-ab49-0e2342ca2e35\") " Apr 24 17:01:28.255141 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.255028 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b319f88-1f50-429e-ab49-0e2342ca2e35-kserve-provision-location\") pod \"3b319f88-1f50-429e-ab49-0e2342ca2e35\" (UID: \"3b319f88-1f50-429e-ab49-0e2342ca2e35\") " Apr 24 17:01:28.255437 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.255396 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b319f88-1f50-429e-ab49-0e2342ca2e35-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config") pod "3b319f88-1f50-429e-ab49-0e2342ca2e35" (UID: "3b319f88-1f50-429e-ab49-0e2342ca2e35"). InnerVolumeSpecName "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:01:28.255541 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.255451 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b319f88-1f50-429e-ab49-0e2342ca2e35-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3b319f88-1f50-429e-ab49-0e2342ca2e35" (UID: "3b319f88-1f50-429e-ab49-0e2342ca2e35"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:01:28.257223 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.257203 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b319f88-1f50-429e-ab49-0e2342ca2e35-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3b319f88-1f50-429e-ab49-0e2342ca2e35" (UID: "3b319f88-1f50-429e-ab49-0e2342ca2e35"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:01:28.257299 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.257257 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b319f88-1f50-429e-ab49-0e2342ca2e35-kube-api-access-p92f2" (OuterVolumeSpecName: "kube-api-access-p92f2") pod "3b319f88-1f50-429e-ab49-0e2342ca2e35" (UID: "3b319f88-1f50-429e-ab49-0e2342ca2e35"). InnerVolumeSpecName "kube-api-access-p92f2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:01:28.356536 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.356501 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p92f2\" (UniqueName: \"kubernetes.io/projected/3b319f88-1f50-429e-ab49-0e2342ca2e35-kube-api-access-p92f2\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:01:28.356536 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.356528 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3b319f88-1f50-429e-ab49-0e2342ca2e35-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:01:28.356536 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.356542 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3b319f88-1f50-429e-ab49-0e2342ca2e35-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:01:28.356755 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.356550 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b319f88-1f50-429e-ab49-0e2342ca2e35-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:01:28.462965 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.462937 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"] Apr 24 17:01:28.471450 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:28.471416 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-vbjq2"] Apr 24 17:01:30.217190 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:30.217152 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3b319f88-1f50-429e-ab49-0e2342ca2e35" path="/var/lib/kubelet/pods/3b319f88-1f50-429e-ab49-0e2342ca2e35/volumes" Apr 24 17:01:32.158007 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:32.157972 2573 generic.go:358] "Generic (PLEG): container finished" podID="b3745a76-0ca5-4b02-9515-1503bb14983e" containerID="9ad7d34774276dc842ba409d820c28eb24739ba81594185ed7b6b28af43080e5" exitCode=0 Apr 24 17:01:32.158431 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:32.158028 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" event={"ID":"b3745a76-0ca5-4b02-9515-1503bb14983e","Type":"ContainerDied","Data":"9ad7d34774276dc842ba409d820c28eb24739ba81594185ed7b6b28af43080e5"} Apr 24 17:01:33.162782 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:33.162744 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" event={"ID":"b3745a76-0ca5-4b02-9515-1503bb14983e","Type":"ContainerStarted","Data":"f41e08ff59250f4277d65bb2482718bd6fbdb7070782c714298bd8d03437a5f6"} Apr 24 17:01:33.162782 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:33.162780 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" event={"ID":"b3745a76-0ca5-4b02-9515-1503bb14983e","Type":"ContainerStarted","Data":"8de1695d15e99616950bb5c84f5356c2e7e13a293e4ea564172f956bb89f1e5b"} Apr 24 17:01:33.163213 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:33.163103 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" Apr 24 17:01:33.163253 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:33.163214 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" Apr 24 17:01:33.164685 ip-10-0-142-182 
kubenswrapper[2573]: I0424 17:01:33.164659 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" podUID="b3745a76-0ca5-4b02-9515-1503bb14983e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 17:01:33.180072 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:33.180029 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" podStartSLOduration=6.180015449 podStartE2EDuration="6.180015449s" podCreationTimestamp="2026-04-24 17:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:01:33.178996861 +0000 UTC m=+1351.443712284" watchObservedRunningTime="2026-04-24 17:01:33.180015449 +0000 UTC m=+1351.444730847" Apr 24 17:01:34.165996 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:34.165961 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" podUID="b3745a76-0ca5-4b02-9515-1503bb14983e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 17:01:39.169812 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:39.169775 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" Apr 24 17:01:39.170463 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:39.170423 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" podUID="b3745a76-0ca5-4b02-9515-1503bb14983e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 24 17:01:49.170930 ip-10-0-142-182 
kubenswrapper[2573]: I0424 17:01:49.170899 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" Apr 24 17:01:57.129488 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.129454 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk"] Apr 24 17:01:57.130071 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.129755 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" podUID="b3745a76-0ca5-4b02-9515-1503bb14983e" containerName="kserve-container" containerID="cri-o://8de1695d15e99616950bb5c84f5356c2e7e13a293e4ea564172f956bb89f1e5b" gracePeriod=30 Apr 24 17:01:57.130071 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.129825 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" podUID="b3745a76-0ca5-4b02-9515-1503bb14983e" containerName="kube-rbac-proxy" containerID="cri-o://f41e08ff59250f4277d65bb2482718bd6fbdb7070782c714298bd8d03437a5f6" gracePeriod=30 Apr 24 17:01:57.226195 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.226158 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l"] Apr 24 17:01:57.226487 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.226473 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b319f88-1f50-429e-ab49-0e2342ca2e35" containerName="kube-rbac-proxy" Apr 24 17:01:57.226538 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.226489 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b319f88-1f50-429e-ab49-0e2342ca2e35" containerName="kube-rbac-proxy" Apr 24 17:01:57.226538 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.226502 2573 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="3b319f88-1f50-429e-ab49-0e2342ca2e35" containerName="storage-initializer" Apr 24 17:01:57.226538 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.226508 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b319f88-1f50-429e-ab49-0e2342ca2e35" containerName="storage-initializer" Apr 24 17:01:57.226538 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.226524 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b319f88-1f50-429e-ab49-0e2342ca2e35" containerName="kserve-container" Apr 24 17:01:57.226538 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.226530 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b319f88-1f50-429e-ab49-0e2342ca2e35" containerName="kserve-container" Apr 24 17:01:57.226685 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.226573 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3b319f88-1f50-429e-ab49-0e2342ca2e35" containerName="kserve-container" Apr 24 17:01:57.226685 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.226584 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3b319f88-1f50-429e-ab49-0e2342ca2e35" containerName="kube-rbac-proxy" Apr 24 17:01:57.229615 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.229592 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" Apr 24 17:01:57.231673 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.231650 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-predictor-serving-cert\"" Apr 24 17:01:57.231956 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.231937 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 24 17:01:57.242624 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.242568 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l"] Apr 24 17:01:57.289539 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.289497 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/280ace04-6192-4551-ab95-e3fc2196f250-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l\" (UID: \"280ace04-6192-4551-ab95-e3fc2196f250\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" Apr 24 17:01:57.289539 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.289539 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/280ace04-6192-4551-ab95-e3fc2196f250-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l\" (UID: \"280ace04-6192-4551-ab95-e3fc2196f250\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" Apr 24 17:01:57.289768 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.289568 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-6mpv7\" (UniqueName: \"kubernetes.io/projected/280ace04-6192-4551-ab95-e3fc2196f250-kube-api-access-6mpv7\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l\" (UID: \"280ace04-6192-4551-ab95-e3fc2196f250\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" Apr 24 17:01:57.289768 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.289629 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/280ace04-6192-4551-ab95-e3fc2196f250-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l\" (UID: \"280ace04-6192-4551-ab95-e3fc2196f250\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" Apr 24 17:01:57.390480 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.390382 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/280ace04-6192-4551-ab95-e3fc2196f250-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l\" (UID: \"280ace04-6192-4551-ab95-e3fc2196f250\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" Apr 24 17:01:57.390480 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.390449 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/280ace04-6192-4551-ab95-e3fc2196f250-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l\" (UID: \"280ace04-6192-4551-ab95-e3fc2196f250\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" Apr 24 17:01:57.390712 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.390502 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mpv7\" 
(UniqueName: \"kubernetes.io/projected/280ace04-6192-4551-ab95-e3fc2196f250-kube-api-access-6mpv7\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l\" (UID: \"280ace04-6192-4551-ab95-e3fc2196f250\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" Apr 24 17:01:57.390712 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.390583 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/280ace04-6192-4551-ab95-e3fc2196f250-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l\" (UID: \"280ace04-6192-4551-ab95-e3fc2196f250\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" Apr 24 17:01:57.390811 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:01:57.390725 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-serving-cert: secret "isvc-mlflow-v2-runtime-predictor-serving-cert" not found Apr 24 17:01:57.390811 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:01:57.390798 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/280ace04-6192-4551-ab95-e3fc2196f250-proxy-tls podName:280ace04-6192-4551-ab95-e3fc2196f250 nodeName:}" failed. No retries permitted until 2026-04-24 17:01:57.890773008 +0000 UTC m=+1376.155488391 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/280ace04-6192-4551-ab95-e3fc2196f250-proxy-tls") pod "isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" (UID: "280ace04-6192-4551-ab95-e3fc2196f250") : secret "isvc-mlflow-v2-runtime-predictor-serving-cert" not found Apr 24 17:01:57.390911 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.390864 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/280ace04-6192-4551-ab95-e3fc2196f250-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l\" (UID: \"280ace04-6192-4551-ab95-e3fc2196f250\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" Apr 24 17:01:57.391154 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.391137 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/280ace04-6192-4551-ab95-e3fc2196f250-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l\" (UID: \"280ace04-6192-4551-ab95-e3fc2196f250\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" Apr 24 17:01:57.399265 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.399245 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mpv7\" (UniqueName: \"kubernetes.io/projected/280ace04-6192-4551-ab95-e3fc2196f250-kube-api-access-6mpv7\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l\" (UID: \"280ace04-6192-4551-ab95-e3fc2196f250\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" Apr 24 17:01:57.862810 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.862786 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" Apr 24 17:01:57.895177 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.895140 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld26p\" (UniqueName: \"kubernetes.io/projected/b3745a76-0ca5-4b02-9515-1503bb14983e-kube-api-access-ld26p\") pod \"b3745a76-0ca5-4b02-9515-1503bb14983e\" (UID: \"b3745a76-0ca5-4b02-9515-1503bb14983e\") " Apr 24 17:01:57.895353 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.895205 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b3745a76-0ca5-4b02-9515-1503bb14983e-kserve-provision-location\") pod \"b3745a76-0ca5-4b02-9515-1503bb14983e\" (UID: \"b3745a76-0ca5-4b02-9515-1503bb14983e\") " Apr 24 17:01:57.895353 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.895247 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b3745a76-0ca5-4b02-9515-1503bb14983e-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"b3745a76-0ca5-4b02-9515-1503bb14983e\" (UID: \"b3745a76-0ca5-4b02-9515-1503bb14983e\") " Apr 24 17:01:57.895353 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.895286 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3745a76-0ca5-4b02-9515-1503bb14983e-proxy-tls\") pod \"b3745a76-0ca5-4b02-9515-1503bb14983e\" (UID: \"b3745a76-0ca5-4b02-9515-1503bb14983e\") " Apr 24 17:01:57.895527 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.895431 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/280ace04-6192-4551-ab95-e3fc2196f250-proxy-tls\") pod 
\"isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l\" (UID: \"280ace04-6192-4551-ab95-e3fc2196f250\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" Apr 24 17:01:57.895669 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.895637 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3745a76-0ca5-4b02-9515-1503bb14983e-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config") pod "b3745a76-0ca5-4b02-9515-1503bb14983e" (UID: "b3745a76-0ca5-4b02-9515-1503bb14983e"). InnerVolumeSpecName "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:01:57.895745 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:01:57.895650 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-serving-cert: secret "isvc-mlflow-v2-runtime-predictor-serving-cert" not found Apr 24 17:01:57.895809 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.895642 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3745a76-0ca5-4b02-9515-1503bb14983e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b3745a76-0ca5-4b02-9515-1503bb14983e" (UID: "b3745a76-0ca5-4b02-9515-1503bb14983e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:01:57.895809 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:01:57.895747 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/280ace04-6192-4551-ab95-e3fc2196f250-proxy-tls podName:280ace04-6192-4551-ab95-e3fc2196f250 nodeName:}" failed. No retries permitted until 2026-04-24 17:01:58.895727636 +0000 UTC m=+1377.160443012 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/280ace04-6192-4551-ab95-e3fc2196f250-proxy-tls") pod "isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" (UID: "280ace04-6192-4551-ab95-e3fc2196f250") : secret "isvc-mlflow-v2-runtime-predictor-serving-cert" not found Apr 24 17:01:57.897536 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.897510 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3745a76-0ca5-4b02-9515-1503bb14983e-kube-api-access-ld26p" (OuterVolumeSpecName: "kube-api-access-ld26p") pod "b3745a76-0ca5-4b02-9515-1503bb14983e" (UID: "b3745a76-0ca5-4b02-9515-1503bb14983e"). InnerVolumeSpecName "kube-api-access-ld26p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:01:57.897536 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.897518 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3745a76-0ca5-4b02-9515-1503bb14983e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b3745a76-0ca5-4b02-9515-1503bb14983e" (UID: "b3745a76-0ca5-4b02-9515-1503bb14983e"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:01:57.996500 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.996459 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3745a76-0ca5-4b02-9515-1503bb14983e-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:01:57.996500 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.996494 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ld26p\" (UniqueName: \"kubernetes.io/projected/b3745a76-0ca5-4b02-9515-1503bb14983e-kube-api-access-ld26p\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:01:57.996680 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.996509 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b3745a76-0ca5-4b02-9515-1503bb14983e-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:01:57.996680 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:57.996523 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b3745a76-0ca5-4b02-9515-1503bb14983e-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:01:58.234111 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.234080 2573 generic.go:358] "Generic (PLEG): container finished" podID="b3745a76-0ca5-4b02-9515-1503bb14983e" containerID="f41e08ff59250f4277d65bb2482718bd6fbdb7070782c714298bd8d03437a5f6" exitCode=2 Apr 24 17:01:58.234111 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.234106 2573 generic.go:358] "Generic (PLEG): container finished" podID="b3745a76-0ca5-4b02-9515-1503bb14983e" containerID="8de1695d15e99616950bb5c84f5356c2e7e13a293e4ea564172f956bb89f1e5b" exitCode=0 Apr 24 17:01:58.234544 ip-10-0-142-182 
kubenswrapper[2573]: I0424 17:01:58.234159 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" Apr 24 17:01:58.234544 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.234167 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" event={"ID":"b3745a76-0ca5-4b02-9515-1503bb14983e","Type":"ContainerDied","Data":"f41e08ff59250f4277d65bb2482718bd6fbdb7070782c714298bd8d03437a5f6"} Apr 24 17:01:58.234544 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.234206 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" event={"ID":"b3745a76-0ca5-4b02-9515-1503bb14983e","Type":"ContainerDied","Data":"8de1695d15e99616950bb5c84f5356c2e7e13a293e4ea564172f956bb89f1e5b"} Apr 24 17:01:58.234544 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.234217 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk" event={"ID":"b3745a76-0ca5-4b02-9515-1503bb14983e","Type":"ContainerDied","Data":"ae18e64420dc0822ea2d4d0f17ddb86e23c857afa8c0c33780510c8fb76bfc7c"} Apr 24 17:01:58.234544 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.234231 2573 scope.go:117] "RemoveContainer" containerID="f41e08ff59250f4277d65bb2482718bd6fbdb7070782c714298bd8d03437a5f6" Apr 24 17:01:58.241970 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.241945 2573 scope.go:117] "RemoveContainer" containerID="8de1695d15e99616950bb5c84f5356c2e7e13a293e4ea564172f956bb89f1e5b" Apr 24 17:01:58.248660 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.248640 2573 scope.go:117] "RemoveContainer" containerID="9ad7d34774276dc842ba409d820c28eb24739ba81594185ed7b6b28af43080e5" Apr 24 17:01:58.250792 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.250770 2573 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk"] Apr 24 17:01:58.256147 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.256125 2573 scope.go:117] "RemoveContainer" containerID="f41e08ff59250f4277d65bb2482718bd6fbdb7070782c714298bd8d03437a5f6" Apr 24 17:01:58.256411 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.256395 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-frxbk"] Apr 24 17:01:58.256502 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:01:58.256416 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f41e08ff59250f4277d65bb2482718bd6fbdb7070782c714298bd8d03437a5f6\": container with ID starting with f41e08ff59250f4277d65bb2482718bd6fbdb7070782c714298bd8d03437a5f6 not found: ID does not exist" containerID="f41e08ff59250f4277d65bb2482718bd6fbdb7070782c714298bd8d03437a5f6" Apr 24 17:01:58.256502 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.256453 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f41e08ff59250f4277d65bb2482718bd6fbdb7070782c714298bd8d03437a5f6"} err="failed to get container status \"f41e08ff59250f4277d65bb2482718bd6fbdb7070782c714298bd8d03437a5f6\": rpc error: code = NotFound desc = could not find container \"f41e08ff59250f4277d65bb2482718bd6fbdb7070782c714298bd8d03437a5f6\": container with ID starting with f41e08ff59250f4277d65bb2482718bd6fbdb7070782c714298bd8d03437a5f6 not found: ID does not exist" Apr 24 17:01:58.256502 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.256481 2573 scope.go:117] "RemoveContainer" containerID="8de1695d15e99616950bb5c84f5356c2e7e13a293e4ea564172f956bb89f1e5b" Apr 24 17:01:58.256728 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:01:58.256712 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"8de1695d15e99616950bb5c84f5356c2e7e13a293e4ea564172f956bb89f1e5b\": container with ID starting with 8de1695d15e99616950bb5c84f5356c2e7e13a293e4ea564172f956bb89f1e5b not found: ID does not exist" containerID="8de1695d15e99616950bb5c84f5356c2e7e13a293e4ea564172f956bb89f1e5b" Apr 24 17:01:58.256788 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.256735 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8de1695d15e99616950bb5c84f5356c2e7e13a293e4ea564172f956bb89f1e5b"} err="failed to get container status \"8de1695d15e99616950bb5c84f5356c2e7e13a293e4ea564172f956bb89f1e5b\": rpc error: code = NotFound desc = could not find container \"8de1695d15e99616950bb5c84f5356c2e7e13a293e4ea564172f956bb89f1e5b\": container with ID starting with 8de1695d15e99616950bb5c84f5356c2e7e13a293e4ea564172f956bb89f1e5b not found: ID does not exist" Apr 24 17:01:58.256788 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.256758 2573 scope.go:117] "RemoveContainer" containerID="9ad7d34774276dc842ba409d820c28eb24739ba81594185ed7b6b28af43080e5" Apr 24 17:01:58.256984 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:01:58.256964 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ad7d34774276dc842ba409d820c28eb24739ba81594185ed7b6b28af43080e5\": container with ID starting with 9ad7d34774276dc842ba409d820c28eb24739ba81594185ed7b6b28af43080e5 not found: ID does not exist" containerID="9ad7d34774276dc842ba409d820c28eb24739ba81594185ed7b6b28af43080e5" Apr 24 17:01:58.257021 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.256990 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ad7d34774276dc842ba409d820c28eb24739ba81594185ed7b6b28af43080e5"} err="failed to get container status \"9ad7d34774276dc842ba409d820c28eb24739ba81594185ed7b6b28af43080e5\": rpc error: code = NotFound desc = could not find container 
\"9ad7d34774276dc842ba409d820c28eb24739ba81594185ed7b6b28af43080e5\": container with ID starting with 9ad7d34774276dc842ba409d820c28eb24739ba81594185ed7b6b28af43080e5 not found: ID does not exist" Apr 24 17:01:58.257021 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.257007 2573 scope.go:117] "RemoveContainer" containerID="f41e08ff59250f4277d65bb2482718bd6fbdb7070782c714298bd8d03437a5f6" Apr 24 17:01:58.257215 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.257196 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f41e08ff59250f4277d65bb2482718bd6fbdb7070782c714298bd8d03437a5f6"} err="failed to get container status \"f41e08ff59250f4277d65bb2482718bd6fbdb7070782c714298bd8d03437a5f6\": rpc error: code = NotFound desc = could not find container \"f41e08ff59250f4277d65bb2482718bd6fbdb7070782c714298bd8d03437a5f6\": container with ID starting with f41e08ff59250f4277d65bb2482718bd6fbdb7070782c714298bd8d03437a5f6 not found: ID does not exist" Apr 24 17:01:58.257254 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.257217 2573 scope.go:117] "RemoveContainer" containerID="8de1695d15e99616950bb5c84f5356c2e7e13a293e4ea564172f956bb89f1e5b" Apr 24 17:01:58.257471 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.257453 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8de1695d15e99616950bb5c84f5356c2e7e13a293e4ea564172f956bb89f1e5b"} err="failed to get container status \"8de1695d15e99616950bb5c84f5356c2e7e13a293e4ea564172f956bb89f1e5b\": rpc error: code = NotFound desc = could not find container \"8de1695d15e99616950bb5c84f5356c2e7e13a293e4ea564172f956bb89f1e5b\": container with ID starting with 8de1695d15e99616950bb5c84f5356c2e7e13a293e4ea564172f956bb89f1e5b not found: ID does not exist" Apr 24 17:01:58.257548 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.257475 2573 scope.go:117] "RemoveContainer" 
containerID="9ad7d34774276dc842ba409d820c28eb24739ba81594185ed7b6b28af43080e5" Apr 24 17:01:58.257679 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.257661 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ad7d34774276dc842ba409d820c28eb24739ba81594185ed7b6b28af43080e5"} err="failed to get container status \"9ad7d34774276dc842ba409d820c28eb24739ba81594185ed7b6b28af43080e5\": rpc error: code = NotFound desc = could not find container \"9ad7d34774276dc842ba409d820c28eb24739ba81594185ed7b6b28af43080e5\": container with ID starting with 9ad7d34774276dc842ba409d820c28eb24739ba81594185ed7b6b28af43080e5 not found: ID does not exist" Apr 24 17:01:58.905943 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.905896 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/280ace04-6192-4551-ab95-e3fc2196f250-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l\" (UID: \"280ace04-6192-4551-ab95-e3fc2196f250\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" Apr 24 17:01:58.908455 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:58.908432 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/280ace04-6192-4551-ab95-e3fc2196f250-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l\" (UID: \"280ace04-6192-4551-ab95-e3fc2196f250\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" Apr 24 17:01:59.039861 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:59.039809 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" Apr 24 17:01:59.160771 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:59.160593 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l"] Apr 24 17:01:59.163520 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:01:59.163490 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod280ace04_6192_4551_ab95_e3fc2196f250.slice/crio-4a775bbefe90a3cd000bbcb8115bc2438bff89671977687ab98aee63a9a38828 WatchSource:0}: Error finding container 4a775bbefe90a3cd000bbcb8115bc2438bff89671977687ab98aee63a9a38828: Status 404 returned error can't find the container with id 4a775bbefe90a3cd000bbcb8115bc2438bff89671977687ab98aee63a9a38828 Apr 24 17:01:59.238684 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:59.238649 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" event={"ID":"280ace04-6192-4551-ab95-e3fc2196f250","Type":"ContainerStarted","Data":"ad43356a556f848283bb78d8d173d3493e87eac63e520f1c232323270e7aaa8b"} Apr 24 17:01:59.238684 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:01:59.238685 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" event={"ID":"280ace04-6192-4551-ab95-e3fc2196f250","Type":"ContainerStarted","Data":"4a775bbefe90a3cd000bbcb8115bc2438bff89671977687ab98aee63a9a38828"} Apr 24 17:02:00.220172 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:00.220135 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3745a76-0ca5-4b02-9515-1503bb14983e" path="/var/lib/kubelet/pods/b3745a76-0ca5-4b02-9515-1503bb14983e/volumes" Apr 24 17:02:03.251206 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:03.251166 2573 generic.go:358] "Generic (PLEG): container 
finished" podID="280ace04-6192-4551-ab95-e3fc2196f250" containerID="ad43356a556f848283bb78d8d173d3493e87eac63e520f1c232323270e7aaa8b" exitCode=0 Apr 24 17:02:03.251654 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:03.251240 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" event={"ID":"280ace04-6192-4551-ab95-e3fc2196f250","Type":"ContainerDied","Data":"ad43356a556f848283bb78d8d173d3493e87eac63e520f1c232323270e7aaa8b"} Apr 24 17:02:04.256811 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:04.256769 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" event={"ID":"280ace04-6192-4551-ab95-e3fc2196f250","Type":"ContainerStarted","Data":"6049757d796d935be9ba23571e5563ef055dcbb122ed19cce1a2d101b5379f30"} Apr 24 17:02:04.256811 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:04.256810 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" event={"ID":"280ace04-6192-4551-ab95-e3fc2196f250","Type":"ContainerStarted","Data":"2f18f64e3241e7422f7518c2940a295bc6a3b14508628eea268c6f1271ac23fd"} Apr 24 17:02:04.257360 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:04.257069 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" Apr 24 17:02:04.276800 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:04.276731 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" podStartSLOduration=7.276715087 podStartE2EDuration="7.276715087s" podCreationTimestamp="2026-04-24 17:01:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:02:04.27537596 +0000 UTC 
m=+1382.540091355" watchObservedRunningTime="2026-04-24 17:02:04.276715087 +0000 UTC m=+1382.541430485" Apr 24 17:02:05.260399 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:05.260361 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" Apr 24 17:02:11.268678 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:11.268642 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" Apr 24 17:02:41.272906 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:41.272868 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" Apr 24 17:02:47.319352 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.319296 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l"] Apr 24 17:02:47.319750 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.319720 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" podUID="280ace04-6192-4551-ab95-e3fc2196f250" containerName="kserve-container" containerID="cri-o://2f18f64e3241e7422f7518c2940a295bc6a3b14508628eea268c6f1271ac23fd" gracePeriod=30 Apr 24 17:02:47.319808 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.319743 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" podUID="280ace04-6192-4551-ab95-e3fc2196f250" containerName="kube-rbac-proxy" containerID="cri-o://6049757d796d935be9ba23571e5563ef055dcbb122ed19cce1a2d101b5379f30" gracePeriod=30 Apr 24 17:02:47.412993 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.412958 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs"] Apr 24 17:02:47.413236 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.413224 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3745a76-0ca5-4b02-9515-1503bb14983e" containerName="kserve-container" Apr 24 17:02:47.413281 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.413238 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3745a76-0ca5-4b02-9515-1503bb14983e" containerName="kserve-container" Apr 24 17:02:47.413281 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.413256 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3745a76-0ca5-4b02-9515-1503bb14983e" containerName="storage-initializer" Apr 24 17:02:47.413281 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.413262 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3745a76-0ca5-4b02-9515-1503bb14983e" containerName="storage-initializer" Apr 24 17:02:47.413281 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.413269 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3745a76-0ca5-4b02-9515-1503bb14983e" containerName="kube-rbac-proxy" Apr 24 17:02:47.413281 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.413274 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3745a76-0ca5-4b02-9515-1503bb14983e" containerName="kube-rbac-proxy" Apr 24 17:02:47.413461 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.413349 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b3745a76-0ca5-4b02-9515-1503bb14983e" containerName="kserve-container" Apr 24 17:02:47.413461 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.413360 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b3745a76-0ca5-4b02-9515-1503bb14983e" containerName="kube-rbac-proxy" Apr 24 17:02:47.416661 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.416638 2573 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 17:02:47.419096 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.419072 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-predictor-serving-cert\"" Apr 24 17:02:47.419244 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.419220 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\"" Apr 24 17:02:47.428965 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.428941 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs"] Apr 24 17:02:47.510477 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.510439 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzj67\" (UniqueName: \"kubernetes.io/projected/c41767ab-b00b-4075-8a45-f38715c0de0a-kube-api-access-pzj67\") pod \"isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs\" (UID: \"c41767ab-b00b-4075-8a45-f38715c0de0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 17:02:47.510665 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.510500 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c41767ab-b00b-4075-8a45-f38715c0de0a-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs\" (UID: \"c41767ab-b00b-4075-8a45-f38715c0de0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 17:02:47.510665 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.510530 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/c41767ab-b00b-4075-8a45-f38715c0de0a-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs\" (UID: \"c41767ab-b00b-4075-8a45-f38715c0de0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 17:02:47.510665 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.510551 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c41767ab-b00b-4075-8a45-f38715c0de0a-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs\" (UID: \"c41767ab-b00b-4075-8a45-f38715c0de0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 17:02:47.611778 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.611687 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzj67\" (UniqueName: \"kubernetes.io/projected/c41767ab-b00b-4075-8a45-f38715c0de0a-kube-api-access-pzj67\") pod \"isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs\" (UID: \"c41767ab-b00b-4075-8a45-f38715c0de0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 17:02:47.611778 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.611746 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c41767ab-b00b-4075-8a45-f38715c0de0a-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs\" (UID: \"c41767ab-b00b-4075-8a45-f38715c0de0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 17:02:47.611778 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.611767 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c41767ab-b00b-4075-8a45-f38715c0de0a-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") 
pod \"isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs\" (UID: \"c41767ab-b00b-4075-8a45-f38715c0de0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 17:02:47.612061 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.611790 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c41767ab-b00b-4075-8a45-f38715c0de0a-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs\" (UID: \"c41767ab-b00b-4075-8a45-f38715c0de0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 17:02:47.612194 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.612170 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c41767ab-b00b-4075-8a45-f38715c0de0a-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs\" (UID: \"c41767ab-b00b-4075-8a45-f38715c0de0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 17:02:47.612545 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.612516 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c41767ab-b00b-4075-8a45-f38715c0de0a-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs\" (UID: \"c41767ab-b00b-4075-8a45-f38715c0de0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 17:02:47.614412 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.614392 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c41767ab-b00b-4075-8a45-f38715c0de0a-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs\" (UID: \"c41767ab-b00b-4075-8a45-f38715c0de0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 
17:02:47.620442 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.620416 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzj67\" (UniqueName: \"kubernetes.io/projected/c41767ab-b00b-4075-8a45-f38715c0de0a-kube-api-access-pzj67\") pod \"isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs\" (UID: \"c41767ab-b00b-4075-8a45-f38715c0de0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 17:02:47.729572 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.729530 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 17:02:47.853264 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:47.853228 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs"] Apr 24 17:02:48.383830 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:48.383797 2573 generic.go:358] "Generic (PLEG): container finished" podID="280ace04-6192-4551-ab95-e3fc2196f250" containerID="6049757d796d935be9ba23571e5563ef055dcbb122ed19cce1a2d101b5379f30" exitCode=2 Apr 24 17:02:48.384201 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:48.383852 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" event={"ID":"280ace04-6192-4551-ab95-e3fc2196f250","Type":"ContainerDied","Data":"6049757d796d935be9ba23571e5563ef055dcbb122ed19cce1a2d101b5379f30"} Apr 24 17:02:48.385594 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:48.385565 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" event={"ID":"c41767ab-b00b-4075-8a45-f38715c0de0a","Type":"ContainerStarted","Data":"c564c98c2f48ff27a2637acea624ae0669a832aaa39ea0cd2095c3f1fd61afb2"} Apr 24 17:02:48.385698 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:48.385603 2573 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" event={"ID":"c41767ab-b00b-4075-8a45-f38715c0de0a","Type":"ContainerStarted","Data":"f1004643da1362641d0be4324041d6832672024b11c8607d376a202b89249988"} Apr 24 17:02:48.560212 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:48.560189 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" Apr 24 17:02:48.619695 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:48.619641 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/280ace04-6192-4551-ab95-e3fc2196f250-kserve-provision-location\") pod \"280ace04-6192-4551-ab95-e3fc2196f250\" (UID: \"280ace04-6192-4551-ab95-e3fc2196f250\") " Apr 24 17:02:48.619849 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:48.619737 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/280ace04-6192-4551-ab95-e3fc2196f250-proxy-tls\") pod \"280ace04-6192-4551-ab95-e3fc2196f250\" (UID: \"280ace04-6192-4551-ab95-e3fc2196f250\") " Apr 24 17:02:48.619897 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:48.619852 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mpv7\" (UniqueName: \"kubernetes.io/projected/280ace04-6192-4551-ab95-e3fc2196f250-kube-api-access-6mpv7\") pod \"280ace04-6192-4551-ab95-e3fc2196f250\" (UID: \"280ace04-6192-4551-ab95-e3fc2196f250\") " Apr 24 17:02:48.619897 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:48.619885 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/280ace04-6192-4551-ab95-e3fc2196f250-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod 
\"280ace04-6192-4551-ab95-e3fc2196f250\" (UID: \"280ace04-6192-4551-ab95-e3fc2196f250\") " Apr 24 17:02:48.620034 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:48.620008 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/280ace04-6192-4551-ab95-e3fc2196f250-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "280ace04-6192-4551-ab95-e3fc2196f250" (UID: "280ace04-6192-4551-ab95-e3fc2196f250"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:02:48.620138 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:48.620122 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/280ace04-6192-4551-ab95-e3fc2196f250-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:02:48.620256 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:48.620233 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/280ace04-6192-4551-ab95-e3fc2196f250-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config") pod "280ace04-6192-4551-ab95-e3fc2196f250" (UID: "280ace04-6192-4551-ab95-e3fc2196f250"). InnerVolumeSpecName "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:02:48.621882 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:48.621861 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/280ace04-6192-4551-ab95-e3fc2196f250-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "280ace04-6192-4551-ab95-e3fc2196f250" (UID: "280ace04-6192-4551-ab95-e3fc2196f250"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:02:48.622001 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:48.621977 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/280ace04-6192-4551-ab95-e3fc2196f250-kube-api-access-6mpv7" (OuterVolumeSpecName: "kube-api-access-6mpv7") pod "280ace04-6192-4551-ab95-e3fc2196f250" (UID: "280ace04-6192-4551-ab95-e3fc2196f250"). InnerVolumeSpecName "kube-api-access-6mpv7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:02:48.721201 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:48.721121 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6mpv7\" (UniqueName: \"kubernetes.io/projected/280ace04-6192-4551-ab95-e3fc2196f250-kube-api-access-6mpv7\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:02:48.721201 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:48.721155 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/280ace04-6192-4551-ab95-e3fc2196f250-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:02:48.721201 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:48.721167 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/280ace04-6192-4551-ab95-e3fc2196f250-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:02:49.389901 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:49.389867 2573 generic.go:358] "Generic (PLEG): container finished" podID="280ace04-6192-4551-ab95-e3fc2196f250" containerID="2f18f64e3241e7422f7518c2940a295bc6a3b14508628eea268c6f1271ac23fd" exitCode=0 Apr 24 17:02:49.390348 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:49.389945 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" Apr 24 17:02:49.390348 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:49.389955 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" event={"ID":"280ace04-6192-4551-ab95-e3fc2196f250","Type":"ContainerDied","Data":"2f18f64e3241e7422f7518c2940a295bc6a3b14508628eea268c6f1271ac23fd"} Apr 24 17:02:49.390348 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:49.389988 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l" event={"ID":"280ace04-6192-4551-ab95-e3fc2196f250","Type":"ContainerDied","Data":"4a775bbefe90a3cd000bbcb8115bc2438bff89671977687ab98aee63a9a38828"} Apr 24 17:02:49.390348 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:49.390005 2573 scope.go:117] "RemoveContainer" containerID="6049757d796d935be9ba23571e5563ef055dcbb122ed19cce1a2d101b5379f30" Apr 24 17:02:49.398177 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:49.398159 2573 scope.go:117] "RemoveContainer" containerID="2f18f64e3241e7422f7518c2940a295bc6a3b14508628eea268c6f1271ac23fd" Apr 24 17:02:49.405271 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:49.405254 2573 scope.go:117] "RemoveContainer" containerID="ad43356a556f848283bb78d8d173d3493e87eac63e520f1c232323270e7aaa8b" Apr 24 17:02:49.412212 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:49.412186 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l"] Apr 24 17:02:49.412821 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:49.412807 2573 scope.go:117] "RemoveContainer" containerID="6049757d796d935be9ba23571e5563ef055dcbb122ed19cce1a2d101b5379f30" Apr 24 17:02:49.413063 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:02:49.413044 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"6049757d796d935be9ba23571e5563ef055dcbb122ed19cce1a2d101b5379f30\": container with ID starting with 6049757d796d935be9ba23571e5563ef055dcbb122ed19cce1a2d101b5379f30 not found: ID does not exist" containerID="6049757d796d935be9ba23571e5563ef055dcbb122ed19cce1a2d101b5379f30" Apr 24 17:02:49.413108 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:49.413073 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6049757d796d935be9ba23571e5563ef055dcbb122ed19cce1a2d101b5379f30"} err="failed to get container status \"6049757d796d935be9ba23571e5563ef055dcbb122ed19cce1a2d101b5379f30\": rpc error: code = NotFound desc = could not find container \"6049757d796d935be9ba23571e5563ef055dcbb122ed19cce1a2d101b5379f30\": container with ID starting with 6049757d796d935be9ba23571e5563ef055dcbb122ed19cce1a2d101b5379f30 not found: ID does not exist" Apr 24 17:02:49.413108 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:49.413096 2573 scope.go:117] "RemoveContainer" containerID="2f18f64e3241e7422f7518c2940a295bc6a3b14508628eea268c6f1271ac23fd" Apr 24 17:02:49.413369 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:02:49.413342 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f18f64e3241e7422f7518c2940a295bc6a3b14508628eea268c6f1271ac23fd\": container with ID starting with 2f18f64e3241e7422f7518c2940a295bc6a3b14508628eea268c6f1271ac23fd not found: ID does not exist" containerID="2f18f64e3241e7422f7518c2940a295bc6a3b14508628eea268c6f1271ac23fd" Apr 24 17:02:49.413443 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:49.413368 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f18f64e3241e7422f7518c2940a295bc6a3b14508628eea268c6f1271ac23fd"} err="failed to get container status \"2f18f64e3241e7422f7518c2940a295bc6a3b14508628eea268c6f1271ac23fd\": rpc error: code = NotFound desc 
= could not find container \"2f18f64e3241e7422f7518c2940a295bc6a3b14508628eea268c6f1271ac23fd\": container with ID starting with 2f18f64e3241e7422f7518c2940a295bc6a3b14508628eea268c6f1271ac23fd not found: ID does not exist" Apr 24 17:02:49.413443 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:49.413385 2573 scope.go:117] "RemoveContainer" containerID="ad43356a556f848283bb78d8d173d3493e87eac63e520f1c232323270e7aaa8b" Apr 24 17:02:49.413612 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:02:49.413596 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad43356a556f848283bb78d8d173d3493e87eac63e520f1c232323270e7aaa8b\": container with ID starting with ad43356a556f848283bb78d8d173d3493e87eac63e520f1c232323270e7aaa8b not found: ID does not exist" containerID="ad43356a556f848283bb78d8d173d3493e87eac63e520f1c232323270e7aaa8b" Apr 24 17:02:49.413652 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:49.413618 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad43356a556f848283bb78d8d173d3493e87eac63e520f1c232323270e7aaa8b"} err="failed to get container status \"ad43356a556f848283bb78d8d173d3493e87eac63e520f1c232323270e7aaa8b\": rpc error: code = NotFound desc = could not find container \"ad43356a556f848283bb78d8d173d3493e87eac63e520f1c232323270e7aaa8b\": container with ID starting with ad43356a556f848283bb78d8d173d3493e87eac63e520f1c232323270e7aaa8b not found: ID does not exist" Apr 24 17:02:49.416172 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:49.416153 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-wvc7l"] Apr 24 17:02:50.216613 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:50.216535 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="280ace04-6192-4551-ab95-e3fc2196f250" path="/var/lib/kubelet/pods/280ace04-6192-4551-ab95-e3fc2196f250/volumes" 
Apr 24 17:02:52.401406 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:52.401373 2573 generic.go:358] "Generic (PLEG): container finished" podID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerID="c564c98c2f48ff27a2637acea624ae0669a832aaa39ea0cd2095c3f1fd61afb2" exitCode=0 Apr 24 17:02:52.401773 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:52.401452 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" event={"ID":"c41767ab-b00b-4075-8a45-f38715c0de0a","Type":"ContainerDied","Data":"c564c98c2f48ff27a2637acea624ae0669a832aaa39ea0cd2095c3f1fd61afb2"} Apr 24 17:02:53.406560 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:53.406516 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" event={"ID":"c41767ab-b00b-4075-8a45-f38715c0de0a","Type":"ContainerStarted","Data":"5ff747f30e490d916d8ef3c8f58450f06ceff7b8345075d94db654b4092d17ab"} Apr 24 17:02:56.419385 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:56.419338 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" event={"ID":"c41767ab-b00b-4075-8a45-f38715c0de0a","Type":"ContainerStarted","Data":"b8a0c4ce650fdb7f89584e412a1037606a6d99682345466b72673a7463de6923"} Apr 24 17:02:56.419385 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:56.419382 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" event={"ID":"c41767ab-b00b-4075-8a45-f38715c0de0a","Type":"ContainerStarted","Data":"f746ac8ac2c0d457b648c1246532626e4f7b9d4ca479ee27ec6509e6b7b40179"} Apr 24 17:02:56.419788 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:56.419542 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 17:02:56.440367 ip-10-0-142-182 kubenswrapper[2573]: I0424 
17:02:56.440289 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" podStartSLOduration=6.371593581 podStartE2EDuration="9.44027186s" podCreationTimestamp="2026-04-24 17:02:47 +0000 UTC" firstStartedPulling="2026-04-24 17:02:52.463566549 +0000 UTC m=+1430.728281924" lastFinishedPulling="2026-04-24 17:02:55.532244812 +0000 UTC m=+1433.796960203" observedRunningTime="2026-04-24 17:02:56.439163236 +0000 UTC m=+1434.703878658" watchObservedRunningTime="2026-04-24 17:02:56.44027186 +0000 UTC m=+1434.704987259" Apr 24 17:02:57.424331 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:57.424280 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 17:02:57.424331 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:02:57.424338 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 17:03:03.432359 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:03:03.432321 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 17:03:23.433361 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:03:23.433303 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.30:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.30:8080: connect: connection refused" Apr 24 17:03:33.434644 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:03:33.434612 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 17:04:02.191527 ip-10-0-142-182 
kubenswrapper[2573]: I0424 17:04:02.191492 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log" Apr 24 17:04:02.192448 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:02.192427 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log" Apr 24 17:04:03.435965 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:03.435938 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 17:04:07.478224 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.478175 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs"] Apr 24 17:04:07.478655 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.478582 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerName="kserve-container" containerID="cri-o://5ff747f30e490d916d8ef3c8f58450f06ceff7b8345075d94db654b4092d17ab" gracePeriod=30 Apr 24 17:04:07.478725 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.478641 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerName="kserve-agent" containerID="cri-o://f746ac8ac2c0d457b648c1246532626e4f7b9d4ca479ee27ec6509e6b7b40179" gracePeriod=30 Apr 24 17:04:07.478778 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.478636 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" 
containerName="kube-rbac-proxy" containerID="cri-o://b8a0c4ce650fdb7f89584e412a1037606a6d99682345466b72673a7463de6923" gracePeriod=30 Apr 24 17:04:07.557821 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.557786 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm"] Apr 24 17:04:07.558086 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.558074 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="280ace04-6192-4551-ab95-e3fc2196f250" containerName="kserve-container" Apr 24 17:04:07.558127 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.558087 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="280ace04-6192-4551-ab95-e3fc2196f250" containerName="kserve-container" Apr 24 17:04:07.558127 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.558096 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="280ace04-6192-4551-ab95-e3fc2196f250" containerName="storage-initializer" Apr 24 17:04:07.558127 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.558102 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="280ace04-6192-4551-ab95-e3fc2196f250" containerName="storage-initializer" Apr 24 17:04:07.558127 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.558111 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="280ace04-6192-4551-ab95-e3fc2196f250" containerName="kube-rbac-proxy" Apr 24 17:04:07.558127 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.558116 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="280ace04-6192-4551-ab95-e3fc2196f250" containerName="kube-rbac-proxy" Apr 24 17:04:07.558302 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.558165 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="280ace04-6192-4551-ab95-e3fc2196f250" containerName="kube-rbac-proxy" Apr 24 17:04:07.558302 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.558173 2573 
memory_manager.go:356] "RemoveStaleState removing state" podUID="280ace04-6192-4551-ab95-e3fc2196f250" containerName="kserve-container" Apr 24 17:04:07.561249 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.561233 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" Apr 24 17:04:07.564441 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.564416 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-predictor-serving-cert\"" Apr 24 17:04:07.564441 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.564416 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-kube-rbac-proxy-sar-config\"" Apr 24 17:04:07.572374 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.572352 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm"] Apr 24 17:04:07.623344 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.623277 2573 generic.go:358] "Generic (PLEG): container finished" podID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerID="b8a0c4ce650fdb7f89584e412a1037606a6d99682345466b72673a7463de6923" exitCode=2 Apr 24 17:04:07.623485 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.623350 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" event={"ID":"c41767ab-b00b-4075-8a45-f38715c0de0a","Type":"ContainerDied","Data":"b8a0c4ce650fdb7f89584e412a1037606a6d99682345466b72673a7463de6923"} Apr 24 17:04:07.661797 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.661764 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3658a8fb-b4df-4644-baeb-a7e48285b2ef-kserve-provision-location\") pod 
\"isvc-paddle-predictor-6b8b7cfb4b-25mlm\" (UID: \"3658a8fb-b4df-4644-baeb-a7e48285b2ef\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" Apr 24 17:04:07.661948 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.661799 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gz47\" (UniqueName: \"kubernetes.io/projected/3658a8fb-b4df-4644-baeb-a7e48285b2ef-kube-api-access-5gz47\") pod \"isvc-paddle-predictor-6b8b7cfb4b-25mlm\" (UID: \"3658a8fb-b4df-4644-baeb-a7e48285b2ef\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" Apr 24 17:04:07.661948 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.661869 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3658a8fb-b4df-4644-baeb-a7e48285b2ef-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-25mlm\" (UID: \"3658a8fb-b4df-4644-baeb-a7e48285b2ef\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" Apr 24 17:04:07.661948 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.661905 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3658a8fb-b4df-4644-baeb-a7e48285b2ef-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-25mlm\" (UID: \"3658a8fb-b4df-4644-baeb-a7e48285b2ef\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" Apr 24 17:04:07.763275 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.763223 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3658a8fb-b4df-4644-baeb-a7e48285b2ef-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-25mlm\" (UID: \"3658a8fb-b4df-4644-baeb-a7e48285b2ef\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" 
Apr 24 17:04:07.763275 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.763285 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3658a8fb-b4df-4644-baeb-a7e48285b2ef-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-25mlm\" (UID: \"3658a8fb-b4df-4644-baeb-a7e48285b2ef\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" Apr 24 17:04:07.763552 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.763360 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3658a8fb-b4df-4644-baeb-a7e48285b2ef-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-25mlm\" (UID: \"3658a8fb-b4df-4644-baeb-a7e48285b2ef\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" Apr 24 17:04:07.763552 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.763382 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gz47\" (UniqueName: \"kubernetes.io/projected/3658a8fb-b4df-4644-baeb-a7e48285b2ef-kube-api-access-5gz47\") pod \"isvc-paddle-predictor-6b8b7cfb4b-25mlm\" (UID: \"3658a8fb-b4df-4644-baeb-a7e48285b2ef\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" Apr 24 17:04:07.763851 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.763823 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3658a8fb-b4df-4644-baeb-a7e48285b2ef-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-25mlm\" (UID: \"3658a8fb-b4df-4644-baeb-a7e48285b2ef\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" Apr 24 17:04:07.764032 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.764001 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3658a8fb-b4df-4644-baeb-a7e48285b2ef-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-25mlm\" (UID: \"3658a8fb-b4df-4644-baeb-a7e48285b2ef\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" Apr 24 17:04:07.765848 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.765828 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3658a8fb-b4df-4644-baeb-a7e48285b2ef-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-25mlm\" (UID: \"3658a8fb-b4df-4644-baeb-a7e48285b2ef\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" Apr 24 17:04:07.775905 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.775885 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gz47\" (UniqueName: \"kubernetes.io/projected/3658a8fb-b4df-4644-baeb-a7e48285b2ef-kube-api-access-5gz47\") pod \"isvc-paddle-predictor-6b8b7cfb4b-25mlm\" (UID: \"3658a8fb-b4df-4644-baeb-a7e48285b2ef\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" Apr 24 17:04:07.871785 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.871754 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" Apr 24 17:04:07.997807 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:07.997773 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm"] Apr 24 17:04:08.002954 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:04:08.002923 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3658a8fb_b4df_4644_baeb_a7e48285b2ef.slice/crio-b87f031b1e61bbe303a1883c3d2c8fdf7188bbd8492712191eca2f2d4aa751f2 WatchSource:0}: Error finding container b87f031b1e61bbe303a1883c3d2c8fdf7188bbd8492712191eca2f2d4aa751f2: Status 404 returned error can't find the container with id b87f031b1e61bbe303a1883c3d2c8fdf7188bbd8492712191eca2f2d4aa751f2 Apr 24 17:04:08.427606 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:08.427500 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.30:8643/healthz\": dial tcp 10.134.0.30:8643: connect: connection refused" Apr 24 17:04:08.627554 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:08.627515 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" event={"ID":"3658a8fb-b4df-4644-baeb-a7e48285b2ef","Type":"ContainerStarted","Data":"749c0a11f6282885a279f9344e9474873b555835dba42dbcce0463c1ee71eccf"} Apr 24 17:04:08.627554 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:08.627555 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" event={"ID":"3658a8fb-b4df-4644-baeb-a7e48285b2ef","Type":"ContainerStarted","Data":"b87f031b1e61bbe303a1883c3d2c8fdf7188bbd8492712191eca2f2d4aa751f2"} Apr 24 17:04:09.632552 
ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:09.632519 2573 generic.go:358] "Generic (PLEG): container finished" podID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerID="5ff747f30e490d916d8ef3c8f58450f06ceff7b8345075d94db654b4092d17ab" exitCode=0 Apr 24 17:04:09.632894 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:09.632602 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" event={"ID":"c41767ab-b00b-4075-8a45-f38715c0de0a","Type":"ContainerDied","Data":"5ff747f30e490d916d8ef3c8f58450f06ceff7b8345075d94db654b4092d17ab"} Apr 24 17:04:12.643673 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:12.643637 2573 generic.go:358] "Generic (PLEG): container finished" podID="3658a8fb-b4df-4644-baeb-a7e48285b2ef" containerID="749c0a11f6282885a279f9344e9474873b555835dba42dbcce0463c1ee71eccf" exitCode=0 Apr 24 17:04:12.644048 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:12.643688 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" event={"ID":"3658a8fb-b4df-4644-baeb-a7e48285b2ef","Type":"ContainerDied","Data":"749c0a11f6282885a279f9344e9474873b555835dba42dbcce0463c1ee71eccf"} Apr 24 17:04:13.427807 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:13.427762 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.30:8643/healthz\": dial tcp 10.134.0.30:8643: connect: connection refused" Apr 24 17:04:13.433258 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:13.433231 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerName="kserve-container" probeResult="failure" output="Get 
\"http://10.134.0.30:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.30:8080: connect: connection refused" Apr 24 17:04:18.427667 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:18.427589 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.30:8643/healthz\": dial tcp 10.134.0.30:8643: connect: connection refused" Apr 24 17:04:18.428111 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:18.427773 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 17:04:23.428140 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:23.428047 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.30:8643/healthz\": dial tcp 10.134.0.30:8643: connect: connection refused" Apr 24 17:04:23.433574 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:23.433540 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.30:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.30:8080: connect: connection refused" Apr 24 17:04:24.692261 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:24.692228 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" event={"ID":"3658a8fb-b4df-4644-baeb-a7e48285b2ef","Type":"ContainerStarted","Data":"0ce003d537be3304cc4e426599826d7958c898213cc892517d2ed69c23ab6541"} Apr 24 17:04:25.696823 ip-10-0-142-182 
kubenswrapper[2573]: I0424 17:04:25.696780 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" event={"ID":"3658a8fb-b4df-4644-baeb-a7e48285b2ef","Type":"ContainerStarted","Data":"691594a24a820f45a6d49ea124ce79e6e6efb8328a2c904e0289f72977bd5d9b"} Apr 24 17:04:25.697235 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:25.697019 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" Apr 24 17:04:25.717160 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:25.717105 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" podStartSLOduration=6.75626301 podStartE2EDuration="18.717088184s" podCreationTimestamp="2026-04-24 17:04:07 +0000 UTC" firstStartedPulling="2026-04-24 17:04:12.644851995 +0000 UTC m=+1510.909567372" lastFinishedPulling="2026-04-24 17:04:24.605677161 +0000 UTC m=+1522.870392546" observedRunningTime="2026-04-24 17:04:25.714713444 +0000 UTC m=+1523.979428844" watchObservedRunningTime="2026-04-24 17:04:25.717088184 +0000 UTC m=+1523.981803583" Apr 24 17:04:26.700813 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:26.700775 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" Apr 24 17:04:26.702012 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:26.701987 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" podUID="3658a8fb-b4df-4644-baeb-a7e48285b2ef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 17:04:27.704402 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:27.704356 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" 
podUID="3658a8fb-b4df-4644-baeb-a7e48285b2ef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 17:04:28.428027 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:28.427984 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.30:8643/healthz\": dial tcp 10.134.0.30:8643: connect: connection refused" Apr 24 17:04:32.708902 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:32.708874 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" Apr 24 17:04:32.709510 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:32.709480 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" podUID="3658a8fb-b4df-4644-baeb-a7e48285b2ef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 17:04:33.428028 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:33.427982 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.30:8643/healthz\": dial tcp 10.134.0.30:8643: connect: connection refused" Apr 24 17:04:33.432998 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:33.432968 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.30:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.30:8080: connect: 
connection refused" Apr 24 17:04:33.433122 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:33.433107 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 17:04:37.732879 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:37.732799 2573 generic.go:358] "Generic (PLEG): container finished" podID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerID="f746ac8ac2c0d457b648c1246532626e4f7b9d4ca479ee27ec6509e6b7b40179" exitCode=137 Apr 24 17:04:37.733199 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:37.732881 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" event={"ID":"c41767ab-b00b-4075-8a45-f38715c0de0a","Type":"ContainerDied","Data":"f746ac8ac2c0d457b648c1246532626e4f7b9d4ca479ee27ec6509e6b7b40179"} Apr 24 17:04:38.118489 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:38.118463 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 17:04:38.231479 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:38.231445 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c41767ab-b00b-4075-8a45-f38715c0de0a-proxy-tls\") pod \"c41767ab-b00b-4075-8a45-f38715c0de0a\" (UID: \"c41767ab-b00b-4075-8a45-f38715c0de0a\") " Apr 24 17:04:38.231632 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:38.231501 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzj67\" (UniqueName: \"kubernetes.io/projected/c41767ab-b00b-4075-8a45-f38715c0de0a-kube-api-access-pzj67\") pod \"c41767ab-b00b-4075-8a45-f38715c0de0a\" (UID: \"c41767ab-b00b-4075-8a45-f38715c0de0a\") " Apr 24 17:04:38.231632 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:38.231539 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c41767ab-b00b-4075-8a45-f38715c0de0a-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"c41767ab-b00b-4075-8a45-f38715c0de0a\" (UID: \"c41767ab-b00b-4075-8a45-f38715c0de0a\") " Apr 24 17:04:38.231632 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:38.231574 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c41767ab-b00b-4075-8a45-f38715c0de0a-kserve-provision-location\") pod \"c41767ab-b00b-4075-8a45-f38715c0de0a\" (UID: \"c41767ab-b00b-4075-8a45-f38715c0de0a\") " Apr 24 17:04:38.231925 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:38.231895 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c41767ab-b00b-4075-8a45-f38715c0de0a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"c41767ab-b00b-4075-8a45-f38715c0de0a" (UID: "c41767ab-b00b-4075-8a45-f38715c0de0a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:04:38.231997 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:38.231901 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c41767ab-b00b-4075-8a45-f38715c0de0a-isvc-sklearn-mcp-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-mcp-kube-rbac-proxy-sar-config") pod "c41767ab-b00b-4075-8a45-f38715c0de0a" (UID: "c41767ab-b00b-4075-8a45-f38715c0de0a"). InnerVolumeSpecName "isvc-sklearn-mcp-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:04:38.233703 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:38.233670 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c41767ab-b00b-4075-8a45-f38715c0de0a-kube-api-access-pzj67" (OuterVolumeSpecName: "kube-api-access-pzj67") pod "c41767ab-b00b-4075-8a45-f38715c0de0a" (UID: "c41767ab-b00b-4075-8a45-f38715c0de0a"). InnerVolumeSpecName "kube-api-access-pzj67". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:04:38.233800 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:38.233732 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41767ab-b00b-4075-8a45-f38715c0de0a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c41767ab-b00b-4075-8a45-f38715c0de0a" (UID: "c41767ab-b00b-4075-8a45-f38715c0de0a"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:04:38.332356 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:38.332255 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c41767ab-b00b-4075-8a45-f38715c0de0a-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:04:38.332356 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:38.332284 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pzj67\" (UniqueName: \"kubernetes.io/projected/c41767ab-b00b-4075-8a45-f38715c0de0a-kube-api-access-pzj67\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:04:38.332356 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:38.332294 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c41767ab-b00b-4075-8a45-f38715c0de0a-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:04:38.332356 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:38.332324 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c41767ab-b00b-4075-8a45-f38715c0de0a-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:04:38.738041 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:38.737945 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" event={"ID":"c41767ab-b00b-4075-8a45-f38715c0de0a","Type":"ContainerDied","Data":"f1004643da1362641d0be4324041d6832672024b11c8607d376a202b89249988"} Apr 24 17:04:38.738041 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:38.738008 2573 scope.go:117] "RemoveContainer" containerID="b8a0c4ce650fdb7f89584e412a1037606a6d99682345466b72673a7463de6923" Apr 24 17:04:38.738041 ip-10-0-142-182 kubenswrapper[2573]: 
I0424 17:04:38.738019 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs" Apr 24 17:04:38.746904 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:38.746881 2573 scope.go:117] "RemoveContainer" containerID="f746ac8ac2c0d457b648c1246532626e4f7b9d4ca479ee27ec6509e6b7b40179" Apr 24 17:04:38.753897 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:38.753880 2573 scope.go:117] "RemoveContainer" containerID="5ff747f30e490d916d8ef3c8f58450f06ceff7b8345075d94db654b4092d17ab" Apr 24 17:04:38.760718 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:38.760701 2573 scope.go:117] "RemoveContainer" containerID="c564c98c2f48ff27a2637acea624ae0669a832aaa39ea0cd2095c3f1fd61afb2" Apr 24 17:04:38.766166 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:38.766142 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs"] Apr 24 17:04:38.772458 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:38.772436 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-59bc9cc798-lfwvs"] Apr 24 17:04:40.216318 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:40.216281 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" path="/var/lib/kubelet/pods/c41767ab-b00b-4075-8a45-f38715c0de0a/volumes" Apr 24 17:04:42.710267 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:42.710204 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" podUID="3658a8fb-b4df-4644-baeb-a7e48285b2ef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 17:04:52.709661 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:04:52.709618 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" podUID="3658a8fb-b4df-4644-baeb-a7e48285b2ef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 17:05:02.710414 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:02.710375 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" podUID="3658a8fb-b4df-4644-baeb-a7e48285b2ef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Apr 24 17:05:12.710050 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:12.710021 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" Apr 24 17:05:19.147152 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.147118 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm"] Apr 24 17:05:19.147663 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.147560 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" podUID="3658a8fb-b4df-4644-baeb-a7e48285b2ef" containerName="kube-rbac-proxy" containerID="cri-o://691594a24a820f45a6d49ea124ce79e6e6efb8328a2c904e0289f72977bd5d9b" gracePeriod=30 Apr 24 17:05:19.147663 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.147539 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" podUID="3658a8fb-b4df-4644-baeb-a7e48285b2ef" containerName="kserve-container" containerID="cri-o://0ce003d537be3304cc4e426599826d7958c898213cc892517d2ed69c23ab6541" gracePeriod=30 Apr 24 17:05:19.275483 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.275449 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl"] Apr 24 17:05:19.275734 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.275722 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerName="kserve-container" Apr 24 17:05:19.275775 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.275736 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerName="kserve-container" Apr 24 17:05:19.275775 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.275747 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerName="kserve-agent" Apr 24 17:05:19.275775 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.275753 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerName="kserve-agent" Apr 24 17:05:19.275775 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.275767 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerName="storage-initializer" Apr 24 17:05:19.275775 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.275773 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerName="storage-initializer" Apr 24 17:05:19.275915 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.275789 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerName="kube-rbac-proxy" Apr 24 17:05:19.275915 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.275794 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerName="kube-rbac-proxy" Apr 24 17:05:19.275915 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.275836 2573 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerName="kserve-container" Apr 24 17:05:19.275915 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.275845 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerName="kube-rbac-proxy" Apr 24 17:05:19.275915 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.275853 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c41767ab-b00b-4075-8a45-f38715c0de0a" containerName="kserve-agent" Apr 24 17:05:19.278735 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.278717 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" Apr 24 17:05:19.283058 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.283032 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-kube-rbac-proxy-sar-config\"" Apr 24 17:05:19.283359 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.283340 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-predictor-serving-cert\"" Apr 24 17:05:19.300358 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.295096 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl"] Apr 24 17:05:19.349108 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.349073 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ffd6f54-2d24-45a2-a290-63e8bb4020de-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl\" (UID: \"1ffd6f54-2d24-45a2-a290-63e8bb4020de\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" Apr 24 17:05:19.349273 ip-10-0-142-182 kubenswrapper[2573]: I0424 
17:05:19.349131 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cllk4\" (UniqueName: \"kubernetes.io/projected/1ffd6f54-2d24-45a2-a290-63e8bb4020de-kube-api-access-cllk4\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl\" (UID: \"1ffd6f54-2d24-45a2-a290-63e8bb4020de\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" Apr 24 17:05:19.349273 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.349251 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ffd6f54-2d24-45a2-a290-63e8bb4020de-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl\" (UID: \"1ffd6f54-2d24-45a2-a290-63e8bb4020de\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" Apr 24 17:05:19.349372 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.349296 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1ffd6f54-2d24-45a2-a290-63e8bb4020de-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl\" (UID: \"1ffd6f54-2d24-45a2-a290-63e8bb4020de\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" Apr 24 17:05:19.450510 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.450409 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cllk4\" (UniqueName: \"kubernetes.io/projected/1ffd6f54-2d24-45a2-a290-63e8bb4020de-kube-api-access-cllk4\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl\" (UID: \"1ffd6f54-2d24-45a2-a290-63e8bb4020de\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" Apr 24 17:05:19.450510 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.450494 
2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ffd6f54-2d24-45a2-a290-63e8bb4020de-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl\" (UID: \"1ffd6f54-2d24-45a2-a290-63e8bb4020de\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" Apr 24 17:05:19.450510 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.450517 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1ffd6f54-2d24-45a2-a290-63e8bb4020de-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl\" (UID: \"1ffd6f54-2d24-45a2-a290-63e8bb4020de\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" Apr 24 17:05:19.450734 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.450551 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ffd6f54-2d24-45a2-a290-63e8bb4020de-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl\" (UID: \"1ffd6f54-2d24-45a2-a290-63e8bb4020de\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" Apr 24 17:05:19.450939 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.450904 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ffd6f54-2d24-45a2-a290-63e8bb4020de-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl\" (UID: \"1ffd6f54-2d24-45a2-a290-63e8bb4020de\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" Apr 24 17:05:19.451223 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.451203 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1ffd6f54-2d24-45a2-a290-63e8bb4020de-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl\" (UID: \"1ffd6f54-2d24-45a2-a290-63e8bb4020de\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" Apr 24 17:05:19.453115 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.453093 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ffd6f54-2d24-45a2-a290-63e8bb4020de-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl\" (UID: \"1ffd6f54-2d24-45a2-a290-63e8bb4020de\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" Apr 24 17:05:19.459680 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.459658 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cllk4\" (UniqueName: \"kubernetes.io/projected/1ffd6f54-2d24-45a2-a290-63e8bb4020de-kube-api-access-cllk4\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl\" (UID: \"1ffd6f54-2d24-45a2-a290-63e8bb4020de\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" Apr 24 17:05:19.589056 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.589012 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" Apr 24 17:05:19.715680 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.715650 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl"] Apr 24 17:05:19.718162 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:05:19.718135 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ffd6f54_2d24_45a2_a290_63e8bb4020de.slice/crio-0f7bb6b2762116b14161014e60db4a1f4c3fb993a67d25ae931eeab517eab9c0 WatchSource:0}: Error finding container 0f7bb6b2762116b14161014e60db4a1f4c3fb993a67d25ae931eeab517eab9c0: Status 404 returned error can't find the container with id 0f7bb6b2762116b14161014e60db4a1f4c3fb993a67d25ae931eeab517eab9c0 Apr 24 17:05:19.854683 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.854650 2573 generic.go:358] "Generic (PLEG): container finished" podID="3658a8fb-b4df-4644-baeb-a7e48285b2ef" containerID="691594a24a820f45a6d49ea124ce79e6e6efb8328a2c904e0289f72977bd5d9b" exitCode=2 Apr 24 17:05:19.854857 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.854734 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" event={"ID":"3658a8fb-b4df-4644-baeb-a7e48285b2ef","Type":"ContainerDied","Data":"691594a24a820f45a6d49ea124ce79e6e6efb8328a2c904e0289f72977bd5d9b"} Apr 24 17:05:19.856047 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.856022 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" event={"ID":"1ffd6f54-2d24-45a2-a290-63e8bb4020de","Type":"ContainerStarted","Data":"7668618755e9b3ec6908043c097f1a4c209fb1b1faf94a3b266c6156f470058a"} Apr 24 17:05:19.856163 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:19.856049 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" event={"ID":"1ffd6f54-2d24-45a2-a290-63e8bb4020de","Type":"ContainerStarted","Data":"0f7bb6b2762116b14161014e60db4a1f4c3fb993a67d25ae931eeab517eab9c0"} Apr 24 17:05:21.864231 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:21.864202 2573 generic.go:358] "Generic (PLEG): container finished" podID="3658a8fb-b4df-4644-baeb-a7e48285b2ef" containerID="0ce003d537be3304cc4e426599826d7958c898213cc892517d2ed69c23ab6541" exitCode=0 Apr 24 17:05:21.864625 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:21.864291 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" event={"ID":"3658a8fb-b4df-4644-baeb-a7e48285b2ef","Type":"ContainerDied","Data":"0ce003d537be3304cc4e426599826d7958c898213cc892517d2ed69c23ab6541"} Apr 24 17:05:21.984077 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:21.984052 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" Apr 24 17:05:22.070092 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:22.070064 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3658a8fb-b4df-4644-baeb-a7e48285b2ef-proxy-tls\") pod \"3658a8fb-b4df-4644-baeb-a7e48285b2ef\" (UID: \"3658a8fb-b4df-4644-baeb-a7e48285b2ef\") " Apr 24 17:05:22.070253 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:22.070130 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3658a8fb-b4df-4644-baeb-a7e48285b2ef-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"3658a8fb-b4df-4644-baeb-a7e48285b2ef\" (UID: \"3658a8fb-b4df-4644-baeb-a7e48285b2ef\") " Apr 24 17:05:22.070344 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:22.070297 2573 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3658a8fb-b4df-4644-baeb-a7e48285b2ef-kserve-provision-location\") pod \"3658a8fb-b4df-4644-baeb-a7e48285b2ef\" (UID: \"3658a8fb-b4df-4644-baeb-a7e48285b2ef\") " Apr 24 17:05:22.070393 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:22.070370 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gz47\" (UniqueName: \"kubernetes.io/projected/3658a8fb-b4df-4644-baeb-a7e48285b2ef-kube-api-access-5gz47\") pod \"3658a8fb-b4df-4644-baeb-a7e48285b2ef\" (UID: \"3658a8fb-b4df-4644-baeb-a7e48285b2ef\") " Apr 24 17:05:22.070467 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:22.070446 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3658a8fb-b4df-4644-baeb-a7e48285b2ef-isvc-paddle-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-kube-rbac-proxy-sar-config") pod "3658a8fb-b4df-4644-baeb-a7e48285b2ef" (UID: "3658a8fb-b4df-4644-baeb-a7e48285b2ef"). InnerVolumeSpecName "isvc-paddle-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:05:22.070725 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:22.070705 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3658a8fb-b4df-4644-baeb-a7e48285b2ef-isvc-paddle-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:05:22.072342 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:22.072292 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3658a8fb-b4df-4644-baeb-a7e48285b2ef-kube-api-access-5gz47" (OuterVolumeSpecName: "kube-api-access-5gz47") pod "3658a8fb-b4df-4644-baeb-a7e48285b2ef" (UID: "3658a8fb-b4df-4644-baeb-a7e48285b2ef"). InnerVolumeSpecName "kube-api-access-5gz47". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:05:22.072499 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:22.072484 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3658a8fb-b4df-4644-baeb-a7e48285b2ef-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3658a8fb-b4df-4644-baeb-a7e48285b2ef" (UID: "3658a8fb-b4df-4644-baeb-a7e48285b2ef"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:05:22.084245 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:22.084219 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3658a8fb-b4df-4644-baeb-a7e48285b2ef-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3658a8fb-b4df-4644-baeb-a7e48285b2ef" (UID: "3658a8fb-b4df-4644-baeb-a7e48285b2ef"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:05:22.171864 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:22.171782 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3658a8fb-b4df-4644-baeb-a7e48285b2ef-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:05:22.171864 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:22.171810 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5gz47\" (UniqueName: \"kubernetes.io/projected/3658a8fb-b4df-4644-baeb-a7e48285b2ef-kube-api-access-5gz47\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:05:22.171864 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:22.171820 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3658a8fb-b4df-4644-baeb-a7e48285b2ef-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:05:22.869637 ip-10-0-142-182 
kubenswrapper[2573]: I0424 17:05:22.869612 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" Apr 24 17:05:22.870059 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:22.869603 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm" event={"ID":"3658a8fb-b4df-4644-baeb-a7e48285b2ef","Type":"ContainerDied","Data":"b87f031b1e61bbe303a1883c3d2c8fdf7188bbd8492712191eca2f2d4aa751f2"} Apr 24 17:05:22.870059 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:22.869735 2573 scope.go:117] "RemoveContainer" containerID="691594a24a820f45a6d49ea124ce79e6e6efb8328a2c904e0289f72977bd5d9b" Apr 24 17:05:22.877486 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:22.877435 2573 scope.go:117] "RemoveContainer" containerID="0ce003d537be3304cc4e426599826d7958c898213cc892517d2ed69c23ab6541" Apr 24 17:05:22.884363 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:22.884346 2573 scope.go:117] "RemoveContainer" containerID="749c0a11f6282885a279f9344e9474873b555835dba42dbcce0463c1ee71eccf" Apr 24 17:05:22.896197 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:22.896176 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm"] Apr 24 17:05:22.901266 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:22.901245 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-25mlm"] Apr 24 17:05:24.216639 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:24.216563 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3658a8fb-b4df-4644-baeb-a7e48285b2ef" path="/var/lib/kubelet/pods/3658a8fb-b4df-4644-baeb-a7e48285b2ef/volumes" Apr 24 17:05:24.885467 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:24.885431 2573 generic.go:358] "Generic (PLEG): container finished" podID="1ffd6f54-2d24-45a2-a290-63e8bb4020de" 
containerID="7668618755e9b3ec6908043c097f1a4c209fb1b1faf94a3b266c6156f470058a" exitCode=0 Apr 24 17:05:24.885715 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:24.885499 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" event={"ID":"1ffd6f54-2d24-45a2-a290-63e8bb4020de","Type":"ContainerDied","Data":"7668618755e9b3ec6908043c097f1a4c209fb1b1faf94a3b266c6156f470058a"} Apr 24 17:05:25.891384 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:25.891348 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" event={"ID":"1ffd6f54-2d24-45a2-a290-63e8bb4020de","Type":"ContainerStarted","Data":"f624d5c6ef9f5b20f6ed4ff269543fc162b02c09667cfd4cf7b728d10f25faf7"} Apr 24 17:05:25.891758 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:25.891392 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" event={"ID":"1ffd6f54-2d24-45a2-a290-63e8bb4020de","Type":"ContainerStarted","Data":"6ad3ce0f3d74e7867efd014f4d0b9814bd3f51715f8684765570ced9eb4e72f9"} Apr 24 17:05:25.891758 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:25.891603 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" Apr 24 17:05:25.916384 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:25.916339 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" podStartSLOduration=6.9163032399999995 podStartE2EDuration="6.91630324s" podCreationTimestamp="2026-04-24 17:05:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:05:25.916166998 +0000 UTC m=+1584.180882394" watchObservedRunningTime="2026-04-24 17:05:25.91630324 +0000 
UTC m=+1584.181018637" Apr 24 17:05:26.893720 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:26.893688 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" Apr 24 17:05:26.894788 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:26.894766 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" podUID="1ffd6f54-2d24-45a2-a290-63e8bb4020de" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 17:05:27.897825 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:27.897785 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" podUID="1ffd6f54-2d24-45a2-a290-63e8bb4020de" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 17:05:32.902354 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:32.902295 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" Apr 24 17:05:32.902852 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:32.902824 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" podUID="1ffd6f54-2d24-45a2-a290-63e8bb4020de" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 17:05:42.903346 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:42.903296 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" podUID="1ffd6f54-2d24-45a2-a290-63e8bb4020de" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 
17:05:52.903030 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:05:52.902936 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" podUID="1ffd6f54-2d24-45a2-a290-63e8bb4020de" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 17:06:02.903760 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:02.903720 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" podUID="1ffd6f54-2d24-45a2-a290-63e8bb4020de" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 17:06:12.904301 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:12.904266 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" Apr 24 17:06:20.674066 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.674033 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl"] Apr 24 17:06:20.674570 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.674387 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" podUID="1ffd6f54-2d24-45a2-a290-63e8bb4020de" containerName="kserve-container" containerID="cri-o://6ad3ce0f3d74e7867efd014f4d0b9814bd3f51715f8684765570ced9eb4e72f9" gracePeriod=30 Apr 24 17:06:20.674570 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.674414 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" podUID="1ffd6f54-2d24-45a2-a290-63e8bb4020de" containerName="kube-rbac-proxy" containerID="cri-o://f624d5c6ef9f5b20f6ed4ff269543fc162b02c09667cfd4cf7b728d10f25faf7" 
gracePeriod=30 Apr 24 17:06:20.777119 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.777083 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn"] Apr 24 17:06:20.777476 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.777451 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3658a8fb-b4df-4644-baeb-a7e48285b2ef" containerName="kserve-container" Apr 24 17:06:20.777476 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.777470 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3658a8fb-b4df-4644-baeb-a7e48285b2ef" containerName="kserve-container" Apr 24 17:06:20.777640 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.777489 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3658a8fb-b4df-4644-baeb-a7e48285b2ef" containerName="storage-initializer" Apr 24 17:06:20.777640 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.777498 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3658a8fb-b4df-4644-baeb-a7e48285b2ef" containerName="storage-initializer" Apr 24 17:06:20.777640 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.777514 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3658a8fb-b4df-4644-baeb-a7e48285b2ef" containerName="kube-rbac-proxy" Apr 24 17:06:20.777640 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.777520 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3658a8fb-b4df-4644-baeb-a7e48285b2ef" containerName="kube-rbac-proxy" Apr 24 17:06:20.777640 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.777567 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3658a8fb-b4df-4644-baeb-a7e48285b2ef" containerName="kserve-container" Apr 24 17:06:20.777640 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.777581 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3658a8fb-b4df-4644-baeb-a7e48285b2ef" 
containerName="kube-rbac-proxy" Apr 24 17:06:20.781895 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.781873 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" Apr 24 17:06:20.784024 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.783997 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-predictor-serving-cert\"" Apr 24 17:06:20.784140 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.784049 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 24 17:06:20.788962 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.788924 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn"] Apr 24 17:06:20.858333 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.858259 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5ebd39c8-c240-478e-9392-a61f6d7ecef5-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn\" (UID: \"5ebd39c8-c240-478e-9392-a61f6d7ecef5\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" Apr 24 17:06:20.858333 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.858338 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjtwm\" (UniqueName: \"kubernetes.io/projected/5ebd39c8-c240-478e-9392-a61f6d7ecef5-kube-api-access-kjtwm\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn\" (UID: \"5ebd39c8-c240-478e-9392-a61f6d7ecef5\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" Apr 24 17:06:20.858585 
ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.858366 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5ebd39c8-c240-478e-9392-a61f6d7ecef5-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn\" (UID: \"5ebd39c8-c240-478e-9392-a61f6d7ecef5\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" Apr 24 17:06:20.858585 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.858385 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ebd39c8-c240-478e-9392-a61f6d7ecef5-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn\" (UID: \"5ebd39c8-c240-478e-9392-a61f6d7ecef5\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" Apr 24 17:06:20.959629 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.959540 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5ebd39c8-c240-478e-9392-a61f6d7ecef5-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn\" (UID: \"5ebd39c8-c240-478e-9392-a61f6d7ecef5\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" Apr 24 17:06:20.959629 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.959592 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjtwm\" (UniqueName: \"kubernetes.io/projected/5ebd39c8-c240-478e-9392-a61f6d7ecef5-kube-api-access-kjtwm\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn\" (UID: \"5ebd39c8-c240-478e-9392-a61f6d7ecef5\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" Apr 24 17:06:20.959836 ip-10-0-142-182 kubenswrapper[2573]: I0424 
17:06:20.959722 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5ebd39c8-c240-478e-9392-a61f6d7ecef5-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn\" (UID: \"5ebd39c8-c240-478e-9392-a61f6d7ecef5\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" Apr 24 17:06:20.959836 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.959768 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ebd39c8-c240-478e-9392-a61f6d7ecef5-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn\" (UID: \"5ebd39c8-c240-478e-9392-a61f6d7ecef5\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" Apr 24 17:06:20.959942 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:06:20.959876 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-serving-cert: secret "isvc-paddle-v2-kserve-predictor-serving-cert" not found Apr 24 17:06:20.959993 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:06:20.959949 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ebd39c8-c240-478e-9392-a61f6d7ecef5-proxy-tls podName:5ebd39c8-c240-478e-9392-a61f6d7ecef5 nodeName:}" failed. No retries permitted until 2026-04-24 17:06:21.459931483 +0000 UTC m=+1639.724646859 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5ebd39c8-c240-478e-9392-a61f6d7ecef5-proxy-tls") pod "isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" (UID: "5ebd39c8-c240-478e-9392-a61f6d7ecef5") : secret "isvc-paddle-v2-kserve-predictor-serving-cert" not found Apr 24 17:06:20.960175 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.960152 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ebd39c8-c240-478e-9392-a61f6d7ecef5-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn\" (UID: \"5ebd39c8-c240-478e-9392-a61f6d7ecef5\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" Apr 24 17:06:20.960266 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.960247 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5ebd39c8-c240-478e-9392-a61f6d7ecef5-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn\" (UID: \"5ebd39c8-c240-478e-9392-a61f6d7ecef5\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" Apr 24 17:06:20.968922 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:20.968901 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjtwm\" (UniqueName: \"kubernetes.io/projected/5ebd39c8-c240-478e-9392-a61f6d7ecef5-kube-api-access-kjtwm\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn\" (UID: \"5ebd39c8-c240-478e-9392-a61f6d7ecef5\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" Apr 24 17:06:21.052195 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:21.052160 2573 generic.go:358] "Generic (PLEG): container finished" podID="1ffd6f54-2d24-45a2-a290-63e8bb4020de" 
containerID="f624d5c6ef9f5b20f6ed4ff269543fc162b02c09667cfd4cf7b728d10f25faf7" exitCode=2 Apr 24 17:06:21.052391 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:21.052236 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" event={"ID":"1ffd6f54-2d24-45a2-a290-63e8bb4020de","Type":"ContainerDied","Data":"f624d5c6ef9f5b20f6ed4ff269543fc162b02c09667cfd4cf7b728d10f25faf7"} Apr 24 17:06:21.463916 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:21.463875 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5ebd39c8-c240-478e-9392-a61f6d7ecef5-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn\" (UID: \"5ebd39c8-c240-478e-9392-a61f6d7ecef5\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" Apr 24 17:06:21.466478 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:21.466456 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5ebd39c8-c240-478e-9392-a61f6d7ecef5-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn\" (UID: \"5ebd39c8-c240-478e-9392-a61f6d7ecef5\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" Apr 24 17:06:21.694259 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:21.694216 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" Apr 24 17:06:21.824126 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:21.824092 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn"] Apr 24 17:06:21.828563 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:06:21.828534 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ebd39c8_c240_478e_9392_a61f6d7ecef5.slice/crio-fc01c900facbf019cc25b5780bb9b39c4742bd9f9c735487f970495e1e7be98a WatchSource:0}: Error finding container fc01c900facbf019cc25b5780bb9b39c4742bd9f9c735487f970495e1e7be98a: Status 404 returned error can't find the container with id fc01c900facbf019cc25b5780bb9b39c4742bd9f9c735487f970495e1e7be98a Apr 24 17:06:21.831029 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:21.831006 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 17:06:22.056429 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:22.056342 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" event={"ID":"5ebd39c8-c240-478e-9392-a61f6d7ecef5","Type":"ContainerStarted","Data":"f9a6cc72cf2cfeb21bb1620ab7caf24c59130b41112e9b693c3f1c074dd7194c"} Apr 24 17:06:22.056429 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:22.056383 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" event={"ID":"5ebd39c8-c240-478e-9392-a61f6d7ecef5","Type":"ContainerStarted","Data":"fc01c900facbf019cc25b5780bb9b39c4742bd9f9c735487f970495e1e7be98a"} Apr 24 17:06:22.898451 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:22.898409 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" 
podUID="1ffd6f54-2d24-45a2-a290-63e8bb4020de" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.32:8643/healthz\": dial tcp 10.134.0.32:8643: connect: connection refused" Apr 24 17:06:22.902771 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:22.902738 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" podUID="1ffd6f54-2d24-45a2-a290-63e8bb4020de" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 24 17:06:23.514137 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:23.514113 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" Apr 24 17:06:23.581163 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:23.581123 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ffd6f54-2d24-45a2-a290-63e8bb4020de-proxy-tls\") pod \"1ffd6f54-2d24-45a2-a290-63e8bb4020de\" (UID: \"1ffd6f54-2d24-45a2-a290-63e8bb4020de\") " Apr 24 17:06:23.581384 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:23.581196 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1ffd6f54-2d24-45a2-a290-63e8bb4020de-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"1ffd6f54-2d24-45a2-a290-63e8bb4020de\" (UID: \"1ffd6f54-2d24-45a2-a290-63e8bb4020de\") " Apr 24 17:06:23.581384 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:23.581251 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cllk4\" (UniqueName: \"kubernetes.io/projected/1ffd6f54-2d24-45a2-a290-63e8bb4020de-kube-api-access-cllk4\") pod \"1ffd6f54-2d24-45a2-a290-63e8bb4020de\" (UID: \"1ffd6f54-2d24-45a2-a290-63e8bb4020de\") 
" Apr 24 17:06:23.581384 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:23.581284 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ffd6f54-2d24-45a2-a290-63e8bb4020de-kserve-provision-location\") pod \"1ffd6f54-2d24-45a2-a290-63e8bb4020de\" (UID: \"1ffd6f54-2d24-45a2-a290-63e8bb4020de\") " Apr 24 17:06:23.581659 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:23.581623 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ffd6f54-2d24-45a2-a290-63e8bb4020de-isvc-paddle-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-runtime-kube-rbac-proxy-sar-config") pod "1ffd6f54-2d24-45a2-a290-63e8bb4020de" (UID: "1ffd6f54-2d24-45a2-a290-63e8bb4020de"). InnerVolumeSpecName "isvc-paddle-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:06:23.583564 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:23.583528 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ffd6f54-2d24-45a2-a290-63e8bb4020de-kube-api-access-cllk4" (OuterVolumeSpecName: "kube-api-access-cllk4") pod "1ffd6f54-2d24-45a2-a290-63e8bb4020de" (UID: "1ffd6f54-2d24-45a2-a290-63e8bb4020de"). InnerVolumeSpecName "kube-api-access-cllk4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:06:23.583664 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:23.583586 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ffd6f54-2d24-45a2-a290-63e8bb4020de-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1ffd6f54-2d24-45a2-a290-63e8bb4020de" (UID: "1ffd6f54-2d24-45a2-a290-63e8bb4020de"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:06:23.591383 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:23.591323 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ffd6f54-2d24-45a2-a290-63e8bb4020de-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1ffd6f54-2d24-45a2-a290-63e8bb4020de" (UID: "1ffd6f54-2d24-45a2-a290-63e8bb4020de"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:06:23.682674 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:23.682595 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ffd6f54-2d24-45a2-a290-63e8bb4020de-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:06:23.682674 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:23.682622 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1ffd6f54-2d24-45a2-a290-63e8bb4020de-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:06:23.682674 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:23.682633 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cllk4\" (UniqueName: \"kubernetes.io/projected/1ffd6f54-2d24-45a2-a290-63e8bb4020de-kube-api-access-cllk4\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:06:23.682674 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:23.682644 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ffd6f54-2d24-45a2-a290-63e8bb4020de-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:06:24.064239 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:24.064194 2573 generic.go:358] "Generic 
(PLEG): container finished" podID="1ffd6f54-2d24-45a2-a290-63e8bb4020de" containerID="6ad3ce0f3d74e7867efd014f4d0b9814bd3f51715f8684765570ced9eb4e72f9" exitCode=0 Apr 24 17:06:24.064239 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:24.064244 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" event={"ID":"1ffd6f54-2d24-45a2-a290-63e8bb4020de","Type":"ContainerDied","Data":"6ad3ce0f3d74e7867efd014f4d0b9814bd3f51715f8684765570ced9eb4e72f9"} Apr 24 17:06:24.064797 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:24.064267 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" event={"ID":"1ffd6f54-2d24-45a2-a290-63e8bb4020de","Type":"ContainerDied","Data":"0f7bb6b2762116b14161014e60db4a1f4c3fb993a67d25ae931eeab517eab9c0"} Apr 24 17:06:24.064797 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:24.064282 2573 scope.go:117] "RemoveContainer" containerID="f624d5c6ef9f5b20f6ed4ff269543fc162b02c09667cfd4cf7b728d10f25faf7" Apr 24 17:06:24.064797 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:24.064338 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl" Apr 24 17:06:24.072384 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:24.072361 2573 scope.go:117] "RemoveContainer" containerID="6ad3ce0f3d74e7867efd014f4d0b9814bd3f51715f8684765570ced9eb4e72f9" Apr 24 17:06:24.079824 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:24.079804 2573 scope.go:117] "RemoveContainer" containerID="7668618755e9b3ec6908043c097f1a4c209fb1b1faf94a3b266c6156f470058a" Apr 24 17:06:24.086327 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:24.086284 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl"] Apr 24 17:06:24.087403 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:24.087381 2573 scope.go:117] "RemoveContainer" containerID="f624d5c6ef9f5b20f6ed4ff269543fc162b02c09667cfd4cf7b728d10f25faf7" Apr 24 17:06:24.087677 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:06:24.087659 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f624d5c6ef9f5b20f6ed4ff269543fc162b02c09667cfd4cf7b728d10f25faf7\": container with ID starting with f624d5c6ef9f5b20f6ed4ff269543fc162b02c09667cfd4cf7b728d10f25faf7 not found: ID does not exist" containerID="f624d5c6ef9f5b20f6ed4ff269543fc162b02c09667cfd4cf7b728d10f25faf7" Apr 24 17:06:24.087736 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:24.087685 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f624d5c6ef9f5b20f6ed4ff269543fc162b02c09667cfd4cf7b728d10f25faf7"} err="failed to get container status \"f624d5c6ef9f5b20f6ed4ff269543fc162b02c09667cfd4cf7b728d10f25faf7\": rpc error: code = NotFound desc = could not find container \"f624d5c6ef9f5b20f6ed4ff269543fc162b02c09667cfd4cf7b728d10f25faf7\": container with ID starting with f624d5c6ef9f5b20f6ed4ff269543fc162b02c09667cfd4cf7b728d10f25faf7 not found: ID does not exist" 
Apr 24 17:06:24.087736 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:24.087702 2573 scope.go:117] "RemoveContainer" containerID="6ad3ce0f3d74e7867efd014f4d0b9814bd3f51715f8684765570ced9eb4e72f9"
Apr 24 17:06:24.087911 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:06:24.087894 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ad3ce0f3d74e7867efd014f4d0b9814bd3f51715f8684765570ced9eb4e72f9\": container with ID starting with 6ad3ce0f3d74e7867efd014f4d0b9814bd3f51715f8684765570ced9eb4e72f9 not found: ID does not exist" containerID="6ad3ce0f3d74e7867efd014f4d0b9814bd3f51715f8684765570ced9eb4e72f9"
Apr 24 17:06:24.087952 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:24.087914 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ad3ce0f3d74e7867efd014f4d0b9814bd3f51715f8684765570ced9eb4e72f9"} err="failed to get container status \"6ad3ce0f3d74e7867efd014f4d0b9814bd3f51715f8684765570ced9eb4e72f9\": rpc error: code = NotFound desc = could not find container \"6ad3ce0f3d74e7867efd014f4d0b9814bd3f51715f8684765570ced9eb4e72f9\": container with ID starting with 6ad3ce0f3d74e7867efd014f4d0b9814bd3f51715f8684765570ced9eb4e72f9 not found: ID does not exist"
Apr 24 17:06:24.087952 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:24.087928 2573 scope.go:117] "RemoveContainer" containerID="7668618755e9b3ec6908043c097f1a4c209fb1b1faf94a3b266c6156f470058a"
Apr 24 17:06:24.088136 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:06:24.088116 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7668618755e9b3ec6908043c097f1a4c209fb1b1faf94a3b266c6156f470058a\": container with ID starting with 7668618755e9b3ec6908043c097f1a4c209fb1b1faf94a3b266c6156f470058a not found: ID does not exist" containerID="7668618755e9b3ec6908043c097f1a4c209fb1b1faf94a3b266c6156f470058a"
Apr 24 17:06:24.088178 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:24.088142 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7668618755e9b3ec6908043c097f1a4c209fb1b1faf94a3b266c6156f470058a"} err="failed to get container status \"7668618755e9b3ec6908043c097f1a4c209fb1b1faf94a3b266c6156f470058a\": rpc error: code = NotFound desc = could not find container \"7668618755e9b3ec6908043c097f1a4c209fb1b1faf94a3b266c6156f470058a\": container with ID starting with 7668618755e9b3ec6908043c097f1a4c209fb1b1faf94a3b266c6156f470058a not found: ID does not exist"
Apr 24 17:06:24.088774 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:24.088755 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-8kmbl"]
Apr 24 17:06:24.216724 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:24.216689 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ffd6f54-2d24-45a2-a290-63e8bb4020de" path="/var/lib/kubelet/pods/1ffd6f54-2d24-45a2-a290-63e8bb4020de/volumes"
Apr 24 17:06:27.074321 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:27.074280 2573 generic.go:358] "Generic (PLEG): container finished" podID="5ebd39c8-c240-478e-9392-a61f6d7ecef5" containerID="f9a6cc72cf2cfeb21bb1620ab7caf24c59130b41112e9b693c3f1c074dd7194c" exitCode=0
Apr 24 17:06:27.074667 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:27.074342 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" event={"ID":"5ebd39c8-c240-478e-9392-a61f6d7ecef5","Type":"ContainerDied","Data":"f9a6cc72cf2cfeb21bb1620ab7caf24c59130b41112e9b693c3f1c074dd7194c"}
Apr 24 17:06:28.078493 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:28.078457 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" event={"ID":"5ebd39c8-c240-478e-9392-a61f6d7ecef5","Type":"ContainerStarted","Data":"65af4152acfa8fc755fcd1180901229ff5a900a09c91ddd5b0d55dafe450fff0"}
Apr 24 17:06:28.078493 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:28.078495 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" event={"ID":"5ebd39c8-c240-478e-9392-a61f6d7ecef5","Type":"ContainerStarted","Data":"7a1a30f1c0d0ec259a703669eb8ee8ceda81942a39235068d01e9e6487843fff"}
Apr 24 17:06:28.078891 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:28.078714 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn"
Apr 24 17:06:28.097427 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:28.097379 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" podStartSLOduration=8.097365562 podStartE2EDuration="8.097365562s" podCreationTimestamp="2026-04-24 17:06:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:06:28.09643283 +0000 UTC m=+1646.361148227" watchObservedRunningTime="2026-04-24 17:06:28.097365562 +0000 UTC m=+1646.362080960"
Apr 24 17:06:29.082502 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:29.082461 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn"
Apr 24 17:06:29.083699 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:29.083669 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" podUID="5ebd39c8-c240-478e-9392-a61f6d7ecef5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 17:06:30.086213 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:30.086167 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" podUID="5ebd39c8-c240-478e-9392-a61f6d7ecef5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 17:06:35.091411 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:35.091383 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn"
Apr 24 17:06:35.091854 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:35.091828 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" podUID="5ebd39c8-c240-478e-9392-a61f6d7ecef5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 17:06:45.092690 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:45.092650 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" podUID="5ebd39c8-c240-478e-9392-a61f6d7ecef5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 17:06:55.092001 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:06:55.091959 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" podUID="5ebd39c8-c240-478e-9392-a61f6d7ecef5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 17:07:05.092468 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:05.092417 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" podUID="5ebd39c8-c240-478e-9392-a61f6d7ecef5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 17:07:15.093197 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:15.093168 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn"
Apr 24 17:07:22.481254 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.481169 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn"]
Apr 24 17:07:22.481646 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.481608 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" podUID="5ebd39c8-c240-478e-9392-a61f6d7ecef5" containerName="kserve-container" containerID="cri-o://7a1a30f1c0d0ec259a703669eb8ee8ceda81942a39235068d01e9e6487843fff" gracePeriod=30
Apr 24 17:07:22.481708 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.481648 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" podUID="5ebd39c8-c240-478e-9392-a61f6d7ecef5" containerName="kube-rbac-proxy" containerID="cri-o://65af4152acfa8fc755fcd1180901229ff5a900a09c91ddd5b0d55dafe450fff0" gracePeriod=30
Apr 24 17:07:22.592328 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.592270 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"]
Apr 24 17:07:22.592588 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.592575 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ffd6f54-2d24-45a2-a290-63e8bb4020de" containerName="kserve-container"
Apr 24 17:07:22.592632 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.592590 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ffd6f54-2d24-45a2-a290-63e8bb4020de" containerName="kserve-container"
Apr 24 17:07:22.592632 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.592601 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ffd6f54-2d24-45a2-a290-63e8bb4020de" containerName="storage-initializer"
Apr 24 17:07:22.592632 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.592607 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ffd6f54-2d24-45a2-a290-63e8bb4020de" containerName="storage-initializer"
Apr 24 17:07:22.592632 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.592613 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ffd6f54-2d24-45a2-a290-63e8bb4020de" containerName="kube-rbac-proxy"
Apr 24 17:07:22.592632 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.592618 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ffd6f54-2d24-45a2-a290-63e8bb4020de" containerName="kube-rbac-proxy"
Apr 24 17:07:22.592779 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.592667 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ffd6f54-2d24-45a2-a290-63e8bb4020de" containerName="kube-rbac-proxy"
Apr 24 17:07:22.592779 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.592675 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ffd6f54-2d24-45a2-a290-63e8bb4020de" containerName="kserve-container"
Apr 24 17:07:22.594909 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.594889 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"
Apr 24 17:07:22.597179 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.597153 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-kube-rbac-proxy-sar-config\""
Apr 24 17:07:22.597296 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.597245 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-predictor-serving-cert\""
Apr 24 17:07:22.607149 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.607124 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"]
Apr 24 17:07:22.666991 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.666960 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-7phwt\" (UID: \"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"
Apr 24 17:07:22.667147 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.667006 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-7phwt\" (UID: \"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"
Apr 24 17:07:22.667147 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.667060 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x2hm\" (UniqueName: \"kubernetes.io/projected/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-kube-api-access-7x2hm\") pod \"isvc-pmml-predictor-8bb578669-7phwt\" (UID: \"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"
Apr 24 17:07:22.667147 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.667095 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-7phwt\" (UID: \"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"
Apr 24 17:07:22.767710 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.767681 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7x2hm\" (UniqueName: \"kubernetes.io/projected/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-kube-api-access-7x2hm\") pod \"isvc-pmml-predictor-8bb578669-7phwt\" (UID: \"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"
Apr 24 17:07:22.767894 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.767724 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-7phwt\" (UID: \"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"
Apr 24 17:07:22.767894 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.767767 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-7phwt\" (UID: \"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"
Apr 24 17:07:22.768017 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.767927 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-7phwt\" (UID: \"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"
Apr 24 17:07:22.768138 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.768119 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-7phwt\" (UID: \"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"
Apr 24 17:07:22.768553 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.768535 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-7phwt\" (UID: \"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"
Apr 24 17:07:22.770288 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.770264 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-7phwt\" (UID: \"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"
Apr 24 17:07:22.775730 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.775704 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x2hm\" (UniqueName: \"kubernetes.io/projected/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-kube-api-access-7x2hm\") pod \"isvc-pmml-predictor-8bb578669-7phwt\" (UID: \"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"
Apr 24 17:07:22.904697 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:22.904654 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"
Apr 24 17:07:23.025131 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:23.025103 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"]
Apr 24 17:07:23.027924 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:07:23.027893 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d65d7b6_99b0_4a06_a3a8_1f113aa6b4c3.slice/crio-bebd0f4dd9046bccfc0dd1441aa13235704be08cc8885d4998999b8c2db2c024 WatchSource:0}: Error finding container bebd0f4dd9046bccfc0dd1441aa13235704be08cc8885d4998999b8c2db2c024: Status 404 returned error can't find the container with id bebd0f4dd9046bccfc0dd1441aa13235704be08cc8885d4998999b8c2db2c024
Apr 24 17:07:23.239360 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:23.239301 2573 generic.go:358] "Generic (PLEG): container finished" podID="5ebd39c8-c240-478e-9392-a61f6d7ecef5" containerID="65af4152acfa8fc755fcd1180901229ff5a900a09c91ddd5b0d55dafe450fff0" exitCode=2
Apr 24 17:07:23.239547 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:23.239389 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" event={"ID":"5ebd39c8-c240-478e-9392-a61f6d7ecef5","Type":"ContainerDied","Data":"65af4152acfa8fc755fcd1180901229ff5a900a09c91ddd5b0d55dafe450fff0"}
Apr 24 17:07:23.240898 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:23.240869 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt" event={"ID":"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3","Type":"ContainerStarted","Data":"399a5d47ee47ab8023ed15047c895a5a247d28a43b97ee56ae52b999c9286ee7"}
Apr 24 17:07:23.241011 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:23.240902 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt" event={"ID":"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3","Type":"ContainerStarted","Data":"bebd0f4dd9046bccfc0dd1441aa13235704be08cc8885d4998999b8c2db2c024"}
Apr 24 17:07:25.087045 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:25.087006 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" podUID="5ebd39c8-c240-478e-9392-a61f6d7ecef5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.33:8643/healthz\": dial tcp 10.134.0.33:8643: connect: connection refused"
Apr 24 17:07:25.092031 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:25.092004 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" podUID="5ebd39c8-c240-478e-9392-a61f6d7ecef5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 24 17:07:25.248195 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:25.248167 2573 generic.go:358] "Generic (PLEG): container finished" podID="5ebd39c8-c240-478e-9392-a61f6d7ecef5" containerID="7a1a30f1c0d0ec259a703669eb8ee8ceda81942a39235068d01e9e6487843fff" exitCode=0
Apr 24 17:07:25.248379 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:25.248235 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" event={"ID":"5ebd39c8-c240-478e-9392-a61f6d7ecef5","Type":"ContainerDied","Data":"7a1a30f1c0d0ec259a703669eb8ee8ceda81942a39235068d01e9e6487843fff"}
Apr 24 17:07:25.318907 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:25.318887 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn"
Apr 24 17:07:25.391629 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:25.391549 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5ebd39c8-c240-478e-9392-a61f6d7ecef5-proxy-tls\") pod \"5ebd39c8-c240-478e-9392-a61f6d7ecef5\" (UID: \"5ebd39c8-c240-478e-9392-a61f6d7ecef5\") "
Apr 24 17:07:25.391629 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:25.391605 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ebd39c8-c240-478e-9392-a61f6d7ecef5-kserve-provision-location\") pod \"5ebd39c8-c240-478e-9392-a61f6d7ecef5\" (UID: \"5ebd39c8-c240-478e-9392-a61f6d7ecef5\") "
Apr 24 17:07:25.391813 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:25.391644 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5ebd39c8-c240-478e-9392-a61f6d7ecef5-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"5ebd39c8-c240-478e-9392-a61f6d7ecef5\" (UID: \"5ebd39c8-c240-478e-9392-a61f6d7ecef5\") "
Apr 24 17:07:25.391813 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:25.391671 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjtwm\" (UniqueName: \"kubernetes.io/projected/5ebd39c8-c240-478e-9392-a61f6d7ecef5-kube-api-access-kjtwm\") pod \"5ebd39c8-c240-478e-9392-a61f6d7ecef5\" (UID: \"5ebd39c8-c240-478e-9392-a61f6d7ecef5\") "
Apr 24 17:07:25.392017 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:25.391994 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ebd39c8-c240-478e-9392-a61f6d7ecef5-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config") pod "5ebd39c8-c240-478e-9392-a61f6d7ecef5" (UID: "5ebd39c8-c240-478e-9392-a61f6d7ecef5"). InnerVolumeSpecName "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 17:07:25.393678 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:25.393652 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ebd39c8-c240-478e-9392-a61f6d7ecef5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5ebd39c8-c240-478e-9392-a61f6d7ecef5" (UID: "5ebd39c8-c240-478e-9392-a61f6d7ecef5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 17:07:25.393985 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:25.393959 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ebd39c8-c240-478e-9392-a61f6d7ecef5-kube-api-access-kjtwm" (OuterVolumeSpecName: "kube-api-access-kjtwm") pod "5ebd39c8-c240-478e-9392-a61f6d7ecef5" (UID: "5ebd39c8-c240-478e-9392-a61f6d7ecef5"). InnerVolumeSpecName "kube-api-access-kjtwm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 17:07:25.401766 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:25.401742 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ebd39c8-c240-478e-9392-a61f6d7ecef5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5ebd39c8-c240-478e-9392-a61f6d7ecef5" (UID: "5ebd39c8-c240-478e-9392-a61f6d7ecef5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 17:07:25.492676 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:25.492646 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ebd39c8-c240-478e-9392-a61f6d7ecef5-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 17:07:25.492676 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:25.492673 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5ebd39c8-c240-478e-9392-a61f6d7ecef5-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 17:07:25.492839 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:25.492694 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kjtwm\" (UniqueName: \"kubernetes.io/projected/5ebd39c8-c240-478e-9392-a61f6d7ecef5-kube-api-access-kjtwm\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 17:07:25.492839 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:25.492704 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5ebd39c8-c240-478e-9392-a61f6d7ecef5-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 17:07:26.252499 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:26.252459 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn" event={"ID":"5ebd39c8-c240-478e-9392-a61f6d7ecef5","Type":"ContainerDied","Data":"fc01c900facbf019cc25b5780bb9b39c4742bd9f9c735487f970495e1e7be98a"}
Apr 24 17:07:26.252899 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:26.252512 2573 scope.go:117] "RemoveContainer" containerID="65af4152acfa8fc755fcd1180901229ff5a900a09c91ddd5b0d55dafe450fff0"
Apr 24 17:07:26.252899 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:26.252551 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn"
Apr 24 17:07:26.260111 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:26.260087 2573 scope.go:117] "RemoveContainer" containerID="7a1a30f1c0d0ec259a703669eb8ee8ceda81942a39235068d01e9e6487843fff"
Apr 24 17:07:26.267127 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:26.267112 2573 scope.go:117] "RemoveContainer" containerID="f9a6cc72cf2cfeb21bb1620ab7caf24c59130b41112e9b693c3f1c074dd7194c"
Apr 24 17:07:26.271592 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:26.271568 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn"]
Apr 24 17:07:26.274528 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:26.274507 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-xjhpn"]
Apr 24 17:07:27.257936 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:27.257904 2573 generic.go:358] "Generic (PLEG): container finished" podID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" containerID="399a5d47ee47ab8023ed15047c895a5a247d28a43b97ee56ae52b999c9286ee7" exitCode=0
Apr 24 17:07:27.258422 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:27.257947 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt" event={"ID":"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3","Type":"ContainerDied","Data":"399a5d47ee47ab8023ed15047c895a5a247d28a43b97ee56ae52b999c9286ee7"}
Apr 24 17:07:28.217019 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:28.216980 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ebd39c8-c240-478e-9392-a61f6d7ecef5" path="/var/lib/kubelet/pods/5ebd39c8-c240-478e-9392-a61f6d7ecef5/volumes"
Apr 24 17:07:35.285807 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:35.285773 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt" event={"ID":"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3","Type":"ContainerStarted","Data":"e5f81ada5e528f478d6a5a5baa4a2be8420ad53a71d10ba19a67e48490a8733f"}
Apr 24 17:07:35.285807 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:35.285813 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt" event={"ID":"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3","Type":"ContainerStarted","Data":"4621a0a36b00c88322520f99fda4bb9612824fee4aa882f306c5e2e4627e00bc"}
Apr 24 17:07:35.286352 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:35.286021 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"
Apr 24 17:07:35.314011 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:35.313959 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt" podStartSLOduration=5.711207012 podStartE2EDuration="13.313944554s" podCreationTimestamp="2026-04-24 17:07:22 +0000 UTC" firstStartedPulling="2026-04-24 17:07:27.259164713 +0000 UTC m=+1705.523880090" lastFinishedPulling="2026-04-24 17:07:34.861902254 +0000 UTC m=+1713.126617632" observedRunningTime="2026-04-24 17:07:35.312854268 +0000 UTC m=+1713.577569682" watchObservedRunningTime="2026-04-24 17:07:35.313944554 +0000 UTC m=+1713.578659951"
Apr 24 17:07:36.289284 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:36.289235 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"
Apr 24 17:07:36.290558 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:36.290526 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt" podUID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 24 17:07:37.291509 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:37.291475 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt" podUID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 24 17:07:42.297704 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:42.297668 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"
Apr 24 17:07:42.298264 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:42.298234 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt" podUID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 24 17:07:52.298823 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:07:52.298781 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt" podUID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 24 17:08:02.299119 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:02.299084 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt" podUID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 24 17:08:12.298641 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:12.298596 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt" podUID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 24 17:08:22.298500 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:22.298459 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt" podUID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 24 17:08:32.299105 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:32.299064 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt" podUID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 24 17:08:42.298781 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:42.298738 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt" podUID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 24 17:08:52.298949 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:52.298863 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"
Apr 24 17:08:53.691872 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:53.691833 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"]
Apr 24 17:08:53.692251 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:53.692159 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt" podUID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" containerName="kserve-container" containerID="cri-o://4621a0a36b00c88322520f99fda4bb9612824fee4aa882f306c5e2e4627e00bc" gracePeriod=30
Apr 24 17:08:53.692251 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:53.692202 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt" podUID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" containerName="kube-rbac-proxy" containerID="cri-o://e5f81ada5e528f478d6a5a5baa4a2be8420ad53a71d10ba19a67e48490a8733f" gracePeriod=30
Apr 24 17:08:53.796897 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:53.796862 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq"]
Apr 24 17:08:53.797233 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:53.797208 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ebd39c8-c240-478e-9392-a61f6d7ecef5" containerName="storage-initializer"
Apr 24 17:08:53.797233 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:53.797229 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ebd39c8-c240-478e-9392-a61f6d7ecef5" containerName="storage-initializer"
Apr 24 17:08:53.797233 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:53.797247 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ebd39c8-c240-478e-9392-a61f6d7ecef5" containerName="kserve-container"
Apr 24 17:08:53.797233 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:53.797253 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ebd39c8-c240-478e-9392-a61f6d7ecef5" containerName="kserve-container"
Apr 24 17:08:53.797515 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:53.797265 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ebd39c8-c240-478e-9392-a61f6d7ecef5" containerName="kube-rbac-proxy"
Apr 24 17:08:53.797515 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:53.797273 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ebd39c8-c240-478e-9392-a61f6d7ecef5" containerName="kube-rbac-proxy"
Apr 24 17:08:53.797515 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:53.797377 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ebd39c8-c240-478e-9392-a61f6d7ecef5" containerName="kserve-container"
Apr 24 17:08:53.797515 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:53.797393 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ebd39c8-c240-478e-9392-a61f6d7ecef5" containerName="kube-rbac-proxy"
Apr 24 17:08:53.800743 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:53.800719 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq"
Apr 24 17:08:53.803146 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:53.803121 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-kube-rbac-proxy-sar-config\""
Apr 24 17:08:53.803454 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:53.803439 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-predictor-serving-cert\""
Apr 24 17:08:53.812251 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:53.812217 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq"]
Apr 24 17:08:53.984654 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:53.984553 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f1cf9f61-843b-4d9e-855b-aacf98c500cf-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-t8nzq\" (UID: \"f1cf9f61-843b-4d9e-855b-aacf98c500cf\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq"
Apr 24 17:08:53.984654 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:53.984600 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1cf9f61-843b-4d9e-855b-aacf98c500cf-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-t8nzq\" (UID: \"f1cf9f61-843b-4d9e-855b-aacf98c500cf\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq"
Apr 24 17:08:53.984654 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:53.984619 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1cf9f61-843b-4d9e-855b-aacf98c500cf-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-t8nzq\" (UID: \"f1cf9f61-843b-4d9e-855b-aacf98c500cf\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq"
Apr 24 17:08:53.984881 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:53.984701 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx4cc\" (UniqueName: \"kubernetes.io/projected/f1cf9f61-843b-4d9e-855b-aacf98c500cf-kube-api-access-mx4cc\") pod \"isvc-pmml-runtime-predictor-67bc544947-t8nzq\" (UID: \"f1cf9f61-843b-4d9e-855b-aacf98c500cf\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq"
Apr 24 17:08:54.085934 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:54.085882 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f1cf9f61-843b-4d9e-855b-aacf98c500cf-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-t8nzq\" (UID: \"f1cf9f61-843b-4d9e-855b-aacf98c500cf\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq"
Apr 24 17:08:54.085934 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:54.085939 2573
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1cf9f61-843b-4d9e-855b-aacf98c500cf-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-t8nzq\" (UID: \"f1cf9f61-843b-4d9e-855b-aacf98c500cf\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" Apr 24 17:08:54.086176 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:54.085963 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1cf9f61-843b-4d9e-855b-aacf98c500cf-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-t8nzq\" (UID: \"f1cf9f61-843b-4d9e-855b-aacf98c500cf\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" Apr 24 17:08:54.086176 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:54.086014 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx4cc\" (UniqueName: \"kubernetes.io/projected/f1cf9f61-843b-4d9e-855b-aacf98c500cf-kube-api-access-mx4cc\") pod \"isvc-pmml-runtime-predictor-67bc544947-t8nzq\" (UID: \"f1cf9f61-843b-4d9e-855b-aacf98c500cf\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" Apr 24 17:08:54.086514 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:54.086491 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1cf9f61-843b-4d9e-855b-aacf98c500cf-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-t8nzq\" (UID: \"f1cf9f61-843b-4d9e-855b-aacf98c500cf\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" Apr 24 17:08:54.086775 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:54.086757 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/f1cf9f61-843b-4d9e-855b-aacf98c500cf-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-t8nzq\" (UID: \"f1cf9f61-843b-4d9e-855b-aacf98c500cf\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" Apr 24 17:08:54.088653 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:54.088636 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1cf9f61-843b-4d9e-855b-aacf98c500cf-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-t8nzq\" (UID: \"f1cf9f61-843b-4d9e-855b-aacf98c500cf\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" Apr 24 17:08:54.094778 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:54.094748 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx4cc\" (UniqueName: \"kubernetes.io/projected/f1cf9f61-843b-4d9e-855b-aacf98c500cf-kube-api-access-mx4cc\") pod \"isvc-pmml-runtime-predictor-67bc544947-t8nzq\" (UID: \"f1cf9f61-843b-4d9e-855b-aacf98c500cf\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" Apr 24 17:08:54.111254 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:54.111212 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" Apr 24 17:08:54.242323 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:54.242282 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq"] Apr 24 17:08:54.244885 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:08:54.244847 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1cf9f61_843b_4d9e_855b_aacf98c500cf.slice/crio-c355434cb4cf8d0ab88bb04c161e1cf27e1db7b7ad01136e1b982e9a8b5e053c WatchSource:0}: Error finding container c355434cb4cf8d0ab88bb04c161e1cf27e1db7b7ad01136e1b982e9a8b5e053c: Status 404 returned error can't find the container with id c355434cb4cf8d0ab88bb04c161e1cf27e1db7b7ad01136e1b982e9a8b5e053c Apr 24 17:08:54.508160 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:54.508063 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" event={"ID":"f1cf9f61-843b-4d9e-855b-aacf98c500cf","Type":"ContainerStarted","Data":"0b4b226e3258359cf5fecf1ddc0b4f00ae614fed2f5b1a57744502fc9afe2624"} Apr 24 17:08:54.508160 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:54.508108 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" event={"ID":"f1cf9f61-843b-4d9e-855b-aacf98c500cf","Type":"ContainerStarted","Data":"c355434cb4cf8d0ab88bb04c161e1cf27e1db7b7ad01136e1b982e9a8b5e053c"} Apr 24 17:08:54.510087 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:54.510057 2573 generic.go:358] "Generic (PLEG): container finished" podID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" containerID="e5f81ada5e528f478d6a5a5baa4a2be8420ad53a71d10ba19a67e48490a8733f" exitCode=2 Apr 24 17:08:54.510229 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:54.510128 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt" event={"ID":"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3","Type":"ContainerDied","Data":"e5f81ada5e528f478d6a5a5baa4a2be8420ad53a71d10ba19a67e48490a8733f"} Apr 24 17:08:57.292234 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:57.292188 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt" podUID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.34:8643/healthz\": dial tcp 10.134.0.34:8643: connect: connection refused" Apr 24 17:08:57.747082 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:57.747060 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt" Apr 24 17:08:57.920793 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:57.920679 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-kserve-provision-location\") pod \"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3\" (UID: \"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3\") " Apr 24 17:08:57.920793 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:57.920780 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3\" (UID: \"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3\") " Apr 24 17:08:57.921023 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:57.920843 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x2hm\" (UniqueName: \"kubernetes.io/projected/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-kube-api-access-7x2hm\") pod \"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3\" (UID: 
\"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3\") " Apr 24 17:08:57.921023 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:57.920877 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-proxy-tls\") pod \"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3\" (UID: \"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3\") " Apr 24 17:08:57.921128 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:57.921065 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" (UID: "8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:08:57.921194 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:57.921163 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-isvc-pmml-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-kube-rbac-proxy-sar-config") pod "8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" (UID: "8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3"). InnerVolumeSpecName "isvc-pmml-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:08:57.923098 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:57.923069 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" (UID: "8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:08:57.923206 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:57.923103 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-kube-api-access-7x2hm" (OuterVolumeSpecName: "kube-api-access-7x2hm") pod "8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" (UID: "8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3"). InnerVolumeSpecName "kube-api-access-7x2hm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:08:58.022297 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:58.022263 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-isvc-pmml-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:08:58.022297 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:58.022294 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7x2hm\" (UniqueName: \"kubernetes.io/projected/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-kube-api-access-7x2hm\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:08:58.022516 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:58.022329 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:08:58.022516 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:58.022340 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:08:58.522819 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:58.522783 2573 generic.go:358] "Generic (PLEG): container finished" 
podID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerID="0b4b226e3258359cf5fecf1ddc0b4f00ae614fed2f5b1a57744502fc9afe2624" exitCode=0 Apr 24 17:08:58.523179 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:58.522860 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" event={"ID":"f1cf9f61-843b-4d9e-855b-aacf98c500cf","Type":"ContainerDied","Data":"0b4b226e3258359cf5fecf1ddc0b4f00ae614fed2f5b1a57744502fc9afe2624"} Apr 24 17:08:58.524646 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:58.524623 2573 generic.go:358] "Generic (PLEG): container finished" podID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" containerID="4621a0a36b00c88322520f99fda4bb9612824fee4aa882f306c5e2e4627e00bc" exitCode=0 Apr 24 17:08:58.524744 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:58.524687 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt" event={"ID":"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3","Type":"ContainerDied","Data":"4621a0a36b00c88322520f99fda4bb9612824fee4aa882f306c5e2e4627e00bc"} Apr 24 17:08:58.524744 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:58.524694 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt" Apr 24 17:08:58.524744 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:58.524714 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt" event={"ID":"8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3","Type":"ContainerDied","Data":"bebd0f4dd9046bccfc0dd1441aa13235704be08cc8885d4998999b8c2db2c024"} Apr 24 17:08:58.524744 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:58.524730 2573 scope.go:117] "RemoveContainer" containerID="e5f81ada5e528f478d6a5a5baa4a2be8420ad53a71d10ba19a67e48490a8733f" Apr 24 17:08:58.536779 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:58.536757 2573 scope.go:117] "RemoveContainer" containerID="4621a0a36b00c88322520f99fda4bb9612824fee4aa882f306c5e2e4627e00bc" Apr 24 17:08:58.547452 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:58.547430 2573 scope.go:117] "RemoveContainer" containerID="399a5d47ee47ab8023ed15047c895a5a247d28a43b97ee56ae52b999c9286ee7" Apr 24 17:08:58.557254 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:58.557233 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"] Apr 24 17:08:58.561389 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:58.561364 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-7phwt"] Apr 24 17:08:58.564791 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:58.564769 2573 scope.go:117] "RemoveContainer" containerID="e5f81ada5e528f478d6a5a5baa4a2be8420ad53a71d10ba19a67e48490a8733f" Apr 24 17:08:58.565173 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:08:58.565149 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5f81ada5e528f478d6a5a5baa4a2be8420ad53a71d10ba19a67e48490a8733f\": container with ID starting with e5f81ada5e528f478d6a5a5baa4a2be8420ad53a71d10ba19a67e48490a8733f 
not found: ID does not exist" containerID="e5f81ada5e528f478d6a5a5baa4a2be8420ad53a71d10ba19a67e48490a8733f" Apr 24 17:08:58.565275 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:58.565180 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5f81ada5e528f478d6a5a5baa4a2be8420ad53a71d10ba19a67e48490a8733f"} err="failed to get container status \"e5f81ada5e528f478d6a5a5baa4a2be8420ad53a71d10ba19a67e48490a8733f\": rpc error: code = NotFound desc = could not find container \"e5f81ada5e528f478d6a5a5baa4a2be8420ad53a71d10ba19a67e48490a8733f\": container with ID starting with e5f81ada5e528f478d6a5a5baa4a2be8420ad53a71d10ba19a67e48490a8733f not found: ID does not exist" Apr 24 17:08:58.565275 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:58.565202 2573 scope.go:117] "RemoveContainer" containerID="4621a0a36b00c88322520f99fda4bb9612824fee4aa882f306c5e2e4627e00bc" Apr 24 17:08:58.565546 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:08:58.565508 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4621a0a36b00c88322520f99fda4bb9612824fee4aa882f306c5e2e4627e00bc\": container with ID starting with 4621a0a36b00c88322520f99fda4bb9612824fee4aa882f306c5e2e4627e00bc not found: ID does not exist" containerID="4621a0a36b00c88322520f99fda4bb9612824fee4aa882f306c5e2e4627e00bc" Apr 24 17:08:58.565649 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:58.565544 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4621a0a36b00c88322520f99fda4bb9612824fee4aa882f306c5e2e4627e00bc"} err="failed to get container status \"4621a0a36b00c88322520f99fda4bb9612824fee4aa882f306c5e2e4627e00bc\": rpc error: code = NotFound desc = could not find container \"4621a0a36b00c88322520f99fda4bb9612824fee4aa882f306c5e2e4627e00bc\": container with ID starting with 4621a0a36b00c88322520f99fda4bb9612824fee4aa882f306c5e2e4627e00bc not found: ID 
does not exist" Apr 24 17:08:58.565649 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:58.565568 2573 scope.go:117] "RemoveContainer" containerID="399a5d47ee47ab8023ed15047c895a5a247d28a43b97ee56ae52b999c9286ee7" Apr 24 17:08:58.565875 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:08:58.565850 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"399a5d47ee47ab8023ed15047c895a5a247d28a43b97ee56ae52b999c9286ee7\": container with ID starting with 399a5d47ee47ab8023ed15047c895a5a247d28a43b97ee56ae52b999c9286ee7 not found: ID does not exist" containerID="399a5d47ee47ab8023ed15047c895a5a247d28a43b97ee56ae52b999c9286ee7" Apr 24 17:08:58.565951 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:58.565875 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"399a5d47ee47ab8023ed15047c895a5a247d28a43b97ee56ae52b999c9286ee7"} err="failed to get container status \"399a5d47ee47ab8023ed15047c895a5a247d28a43b97ee56ae52b999c9286ee7\": rpc error: code = NotFound desc = could not find container \"399a5d47ee47ab8023ed15047c895a5a247d28a43b97ee56ae52b999c9286ee7\": container with ID starting with 399a5d47ee47ab8023ed15047c895a5a247d28a43b97ee56ae52b999c9286ee7 not found: ID does not exist" Apr 24 17:08:59.530162 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:59.530118 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" event={"ID":"f1cf9f61-843b-4d9e-855b-aacf98c500cf","Type":"ContainerStarted","Data":"796173baf82071d74553634cca12d24e8abce631580bfc4c7b4d096fb676b88d"} Apr 24 17:08:59.530626 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:59.530174 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" 
event={"ID":"f1cf9f61-843b-4d9e-855b-aacf98c500cf","Type":"ContainerStarted","Data":"8dc1b42fdc46fafb4fbef4e488775f5d3f9beba2befb0ea8cead0debf8fda4c8"} Apr 24 17:08:59.530626 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:59.530503 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" Apr 24 17:08:59.530747 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:59.530649 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" Apr 24 17:08:59.531934 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:59.531910 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 17:08:59.554984 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:08:59.554928 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" podStartSLOduration=6.554913814 podStartE2EDuration="6.554913814s" podCreationTimestamp="2026-04-24 17:08:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:08:59.553713072 +0000 UTC m=+1797.818428474" watchObservedRunningTime="2026-04-24 17:08:59.554913814 +0000 UTC m=+1797.819629212" Apr 24 17:09:00.216880 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:09:00.216843 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" path="/var/lib/kubelet/pods/8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3/volumes" Apr 24 17:09:00.534427 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:09:00.534385 2573 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 17:09:02.217191 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:09:02.217164 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log" Apr 24 17:09:02.219003 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:09:02.218978 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log" Apr 24 17:09:05.538792 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:09:05.538759 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" Apr 24 17:09:05.539405 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:09:05.539376 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 17:09:15.539902 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:09:15.539859 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 17:09:25.540007 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:09:25.539965 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 17:09:35.539603 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:09:35.539546 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 17:09:45.539788 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:09:45.539745 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 17:09:55.540356 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:09:55.540294 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 17:10:05.539584 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:05.539537 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 17:10:07.212585 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:07.212541 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 
17:10:17.212693 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:17.212641 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 17:10:27.213230 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:27.213152 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" Apr 24 17:10:35.203840 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.203809 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq"] Apr 24 17:10:35.204218 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.204136 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerName="kserve-container" containerID="cri-o://8dc1b42fdc46fafb4fbef4e488775f5d3f9beba2befb0ea8cead0debf8fda4c8" gracePeriod=30 Apr 24 17:10:35.204282 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.204187 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerName="kube-rbac-proxy" containerID="cri-o://796173baf82071d74553634cca12d24e8abce631580bfc4c7b4d096fb676b88d" gracePeriod=30 Apr 24 17:10:35.425090 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.425055 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p"] Apr 24 17:10:35.425373 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.425359 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" containerName="storage-initializer" Apr 24 17:10:35.425373 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.425374 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" containerName="storage-initializer" Apr 24 17:10:35.425517 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.425393 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" containerName="kserve-container" Apr 24 17:10:35.425517 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.425400 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" containerName="kserve-container" Apr 24 17:10:35.425517 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.425412 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" containerName="kube-rbac-proxy" Apr 24 17:10:35.425517 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.425417 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" containerName="kube-rbac-proxy" Apr 24 17:10:35.425517 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.425466 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" containerName="kserve-container" Apr 24 17:10:35.425517 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.425474 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d65d7b6-99b0-4a06-a3a8-1f113aa6b4c3" containerName="kube-rbac-proxy" Apr 24 17:10:35.428565 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.428543 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" Apr 24 17:10:35.430706 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.430684 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 24 17:10:35.430706 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.430694 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-predictor-serving-cert\"" Apr 24 17:10:35.439106 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.439081 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p"] Apr 24 17:10:35.524128 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.524096 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p\" (UID: \"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" Apr 24 17:10:35.524292 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.524140 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cfdj\" (UniqueName: \"kubernetes.io/projected/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-kube-api-access-4cfdj\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p\" (UID: \"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" Apr 24 17:10:35.524292 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.524170 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p\" (UID: \"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" Apr 24 17:10:35.524292 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.524197 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p\" (UID: \"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" Apr 24 17:10:35.535323 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.535278 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.35:8643/healthz\": dial tcp 10.134.0.35:8643: connect: connection refused" Apr 24 17:10:35.624910 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.624871 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p\" (UID: \"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" Apr 24 17:10:35.625069 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.624914 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p\" (UID: \"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" Apr 24 17:10:35.625069 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.624985 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p\" (UID: \"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" Apr 24 17:10:35.625069 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.625029 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cfdj\" (UniqueName: \"kubernetes.io/projected/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-kube-api-access-4cfdj\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p\" (UID: \"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" Apr 24 17:10:35.625273 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:10:35.625146 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-serving-cert: secret "isvc-pmml-v2-kserve-predictor-serving-cert" not found Apr 24 17:10:35.625273 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:10:35.625206 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-proxy-tls podName:5dcaeb88-b6e5-4461-8ab3-7075914b3eaf nodeName:}" failed. No retries permitted until 2026-04-24 17:10:36.125187106 +0000 UTC m=+1894.389902485 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-proxy-tls") pod "isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" (UID: "5dcaeb88-b6e5-4461-8ab3-7075914b3eaf") : secret "isvc-pmml-v2-kserve-predictor-serving-cert" not found Apr 24 17:10:35.625399 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.625267 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p\" (UID: \"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" Apr 24 17:10:35.625690 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.625666 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p\" (UID: \"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" Apr 24 17:10:35.635192 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.635172 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cfdj\" (UniqueName: \"kubernetes.io/projected/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-kube-api-access-4cfdj\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p\" (UID: \"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" Apr 24 17:10:35.820664 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.820579 2573 generic.go:358] "Generic (PLEG): container finished" podID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" 
containerID="796173baf82071d74553634cca12d24e8abce631580bfc4c7b4d096fb676b88d" exitCode=2 Apr 24 17:10:35.820795 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:35.820658 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" event={"ID":"f1cf9f61-843b-4d9e-855b-aacf98c500cf","Type":"ContainerDied","Data":"796173baf82071d74553634cca12d24e8abce631580bfc4c7b4d096fb676b88d"} Apr 24 17:10:36.129331 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:36.129218 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p\" (UID: \"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" Apr 24 17:10:36.131867 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:36.131838 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p\" (UID: \"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" Apr 24 17:10:36.340486 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:36.340445 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" Apr 24 17:10:36.472630 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:36.472595 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p"] Apr 24 17:10:36.476524 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:10:36.476492 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dcaeb88_b6e5_4461_8ab3_7075914b3eaf.slice/crio-50211578428a0a04064efaf3e2492366e02d2476e9878f7c90646cda8f739057 WatchSource:0}: Error finding container 50211578428a0a04064efaf3e2492366e02d2476e9878f7c90646cda8f739057: Status 404 returned error can't find the container with id 50211578428a0a04064efaf3e2492366e02d2476e9878f7c90646cda8f739057 Apr 24 17:10:36.825035 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:36.824995 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" event={"ID":"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf","Type":"ContainerStarted","Data":"004eebe2f45c6e38cee608544158842c1d39860538208510d69133bfe6f2fd2b"} Apr 24 17:10:36.825035 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:36.825041 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" event={"ID":"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf","Type":"ContainerStarted","Data":"50211578428a0a04064efaf3e2492366e02d2476e9878f7c90646cda8f739057"} Apr 24 17:10:37.213407 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:37.213288 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 24 17:10:38.752216 ip-10-0-142-182 
kubenswrapper[2573]: I0424 17:10:38.752192 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" Apr 24 17:10:38.832746 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.832656 2573 generic.go:358] "Generic (PLEG): container finished" podID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerID="8dc1b42fdc46fafb4fbef4e488775f5d3f9beba2befb0ea8cead0debf8fda4c8" exitCode=0 Apr 24 17:10:38.832886 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.832749 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" Apr 24 17:10:38.832886 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.832746 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" event={"ID":"f1cf9f61-843b-4d9e-855b-aacf98c500cf","Type":"ContainerDied","Data":"8dc1b42fdc46fafb4fbef4e488775f5d3f9beba2befb0ea8cead0debf8fda4c8"} Apr 24 17:10:38.832886 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.832861 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq" event={"ID":"f1cf9f61-843b-4d9e-855b-aacf98c500cf","Type":"ContainerDied","Data":"c355434cb4cf8d0ab88bb04c161e1cf27e1db7b7ad01136e1b982e9a8b5e053c"} Apr 24 17:10:38.832886 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.832884 2573 scope.go:117] "RemoveContainer" containerID="796173baf82071d74553634cca12d24e8abce631580bfc4c7b4d096fb676b88d" Apr 24 17:10:38.840814 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.840798 2573 scope.go:117] "RemoveContainer" containerID="8dc1b42fdc46fafb4fbef4e488775f5d3f9beba2befb0ea8cead0debf8fda4c8" Apr 24 17:10:38.847662 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.847645 2573 scope.go:117] "RemoveContainer" 
containerID="0b4b226e3258359cf5fecf1ddc0b4f00ae614fed2f5b1a57744502fc9afe2624" Apr 24 17:10:38.852804 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.852780 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx4cc\" (UniqueName: \"kubernetes.io/projected/f1cf9f61-843b-4d9e-855b-aacf98c500cf-kube-api-access-mx4cc\") pod \"f1cf9f61-843b-4d9e-855b-aacf98c500cf\" (UID: \"f1cf9f61-843b-4d9e-855b-aacf98c500cf\") " Apr 24 17:10:38.852886 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.852835 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1cf9f61-843b-4d9e-855b-aacf98c500cf-kserve-provision-location\") pod \"f1cf9f61-843b-4d9e-855b-aacf98c500cf\" (UID: \"f1cf9f61-843b-4d9e-855b-aacf98c500cf\") " Apr 24 17:10:38.852886 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.852876 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f1cf9f61-843b-4d9e-855b-aacf98c500cf-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"f1cf9f61-843b-4d9e-855b-aacf98c500cf\" (UID: \"f1cf9f61-843b-4d9e-855b-aacf98c500cf\") " Apr 24 17:10:38.852970 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.852921 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1cf9f61-843b-4d9e-855b-aacf98c500cf-proxy-tls\") pod \"f1cf9f61-843b-4d9e-855b-aacf98c500cf\" (UID: \"f1cf9f61-843b-4d9e-855b-aacf98c500cf\") " Apr 24 17:10:38.853228 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.853197 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1cf9f61-843b-4d9e-855b-aacf98c500cf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f1cf9f61-843b-4d9e-855b-aacf98c500cf" 
(UID: "f1cf9f61-843b-4d9e-855b-aacf98c500cf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:10:38.853298 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.853202 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1cf9f61-843b-4d9e-855b-aacf98c500cf-isvc-pmml-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-runtime-kube-rbac-proxy-sar-config") pod "f1cf9f61-843b-4d9e-855b-aacf98c500cf" (UID: "f1cf9f61-843b-4d9e-855b-aacf98c500cf"). InnerVolumeSpecName "isvc-pmml-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:10:38.854857 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.854835 2573 scope.go:117] "RemoveContainer" containerID="796173baf82071d74553634cca12d24e8abce631580bfc4c7b4d096fb676b88d" Apr 24 17:10:38.855041 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.855015 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1cf9f61-843b-4d9e-855b-aacf98c500cf-kube-api-access-mx4cc" (OuterVolumeSpecName: "kube-api-access-mx4cc") pod "f1cf9f61-843b-4d9e-855b-aacf98c500cf" (UID: "f1cf9f61-843b-4d9e-855b-aacf98c500cf"). InnerVolumeSpecName "kube-api-access-mx4cc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:10:38.855132 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.855047 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1cf9f61-843b-4d9e-855b-aacf98c500cf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f1cf9f61-843b-4d9e-855b-aacf98c500cf" (UID: "f1cf9f61-843b-4d9e-855b-aacf98c500cf"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:10:38.855181 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:10:38.855123 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"796173baf82071d74553634cca12d24e8abce631580bfc4c7b4d096fb676b88d\": container with ID starting with 796173baf82071d74553634cca12d24e8abce631580bfc4c7b4d096fb676b88d not found: ID does not exist" containerID="796173baf82071d74553634cca12d24e8abce631580bfc4c7b4d096fb676b88d" Apr 24 17:10:38.855181 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.855153 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"796173baf82071d74553634cca12d24e8abce631580bfc4c7b4d096fb676b88d"} err="failed to get container status \"796173baf82071d74553634cca12d24e8abce631580bfc4c7b4d096fb676b88d\": rpc error: code = NotFound desc = could not find container \"796173baf82071d74553634cca12d24e8abce631580bfc4c7b4d096fb676b88d\": container with ID starting with 796173baf82071d74553634cca12d24e8abce631580bfc4c7b4d096fb676b88d not found: ID does not exist" Apr 24 17:10:38.855181 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.855170 2573 scope.go:117] "RemoveContainer" containerID="8dc1b42fdc46fafb4fbef4e488775f5d3f9beba2befb0ea8cead0debf8fda4c8" Apr 24 17:10:38.855489 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:10:38.855468 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dc1b42fdc46fafb4fbef4e488775f5d3f9beba2befb0ea8cead0debf8fda4c8\": container with ID starting with 8dc1b42fdc46fafb4fbef4e488775f5d3f9beba2befb0ea8cead0debf8fda4c8 not found: ID does not exist" containerID="8dc1b42fdc46fafb4fbef4e488775f5d3f9beba2befb0ea8cead0debf8fda4c8" Apr 24 17:10:38.855546 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.855494 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8dc1b42fdc46fafb4fbef4e488775f5d3f9beba2befb0ea8cead0debf8fda4c8"} err="failed to get container status \"8dc1b42fdc46fafb4fbef4e488775f5d3f9beba2befb0ea8cead0debf8fda4c8\": rpc error: code = NotFound desc = could not find container \"8dc1b42fdc46fafb4fbef4e488775f5d3f9beba2befb0ea8cead0debf8fda4c8\": container with ID starting with 8dc1b42fdc46fafb4fbef4e488775f5d3f9beba2befb0ea8cead0debf8fda4c8 not found: ID does not exist" Apr 24 17:10:38.855546 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.855513 2573 scope.go:117] "RemoveContainer" containerID="0b4b226e3258359cf5fecf1ddc0b4f00ae614fed2f5b1a57744502fc9afe2624" Apr 24 17:10:38.855755 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:10:38.855740 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b4b226e3258359cf5fecf1ddc0b4f00ae614fed2f5b1a57744502fc9afe2624\": container with ID starting with 0b4b226e3258359cf5fecf1ddc0b4f00ae614fed2f5b1a57744502fc9afe2624 not found: ID does not exist" containerID="0b4b226e3258359cf5fecf1ddc0b4f00ae614fed2f5b1a57744502fc9afe2624" Apr 24 17:10:38.855799 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.855760 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b4b226e3258359cf5fecf1ddc0b4f00ae614fed2f5b1a57744502fc9afe2624"} err="failed to get container status \"0b4b226e3258359cf5fecf1ddc0b4f00ae614fed2f5b1a57744502fc9afe2624\": rpc error: code = NotFound desc = could not find container \"0b4b226e3258359cf5fecf1ddc0b4f00ae614fed2f5b1a57744502fc9afe2624\": container with ID starting with 0b4b226e3258359cf5fecf1ddc0b4f00ae614fed2f5b1a57744502fc9afe2624 not found: ID does not exist" Apr 24 17:10:38.954361 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.954323 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1cf9f61-843b-4d9e-855b-aacf98c500cf-proxy-tls\") 
on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:10:38.954361 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.954354 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mx4cc\" (UniqueName: \"kubernetes.io/projected/f1cf9f61-843b-4d9e-855b-aacf98c500cf-kube-api-access-mx4cc\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:10:38.954361 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.954364 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1cf9f61-843b-4d9e-855b-aacf98c500cf-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:10:38.954582 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:38.954376 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f1cf9f61-843b-4d9e-855b-aacf98c500cf-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:10:39.154716 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:39.154685 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq"] Apr 24 17:10:39.159586 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:39.159552 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-t8nzq"] Apr 24 17:10:40.216730 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:40.216695 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" path="/var/lib/kubelet/pods/f1cf9f61-843b-4d9e-855b-aacf98c500cf/volumes" Apr 24 17:10:40.840531 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:40.840492 2573 generic.go:358] "Generic (PLEG): container finished" podID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" 
containerID="004eebe2f45c6e38cee608544158842c1d39860538208510d69133bfe6f2fd2b" exitCode=0 Apr 24 17:10:40.840678 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:40.840569 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" event={"ID":"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf","Type":"ContainerDied","Data":"004eebe2f45c6e38cee608544158842c1d39860538208510d69133bfe6f2fd2b"} Apr 24 17:10:41.845381 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:41.845340 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" event={"ID":"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf","Type":"ContainerStarted","Data":"cd149153a3d20cfb299a5d8721291467e794604eb7358f2d19b681acf7594f82"} Apr 24 17:10:41.845381 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:41.845385 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" event={"ID":"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf","Type":"ContainerStarted","Data":"ddaa90b64b007cb31f3cbf428c0b8336e104ce06e53fd808473e55c3a7b62d30"} Apr 24 17:10:41.845930 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:41.845630 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" Apr 24 17:10:41.863919 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:41.863871 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" podStartSLOduration=6.863858629 podStartE2EDuration="6.863858629s" podCreationTimestamp="2026-04-24 17:10:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:10:41.862919247 +0000 UTC m=+1900.127634673" watchObservedRunningTime="2026-04-24 17:10:41.863858629 +0000 UTC 
m=+1900.128574027" Apr 24 17:10:42.847926 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:42.847891 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" Apr 24 17:10:42.849049 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:42.849022 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" podUID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 17:10:43.850984 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:43.850940 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" podUID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 17:10:48.856549 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:48.856520 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" Apr 24 17:10:48.857149 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:48.857117 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" podUID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 17:10:58.858028 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:10:58.857984 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" podUID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 
17:11:08.857759 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:11:08.857715 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" podUID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 17:11:18.858112 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:11:18.858065 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" podUID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 17:11:28.857257 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:11:28.857214 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" podUID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 17:11:38.858103 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:11:38.858057 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" podUID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 17:11:48.857108 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:11:48.857061 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" podUID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 17:11:58.857296 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:11:58.857191 2573 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" podUID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused" Apr 24 17:12:02.217473 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:02.217443 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" Apr 24 17:12:06.409226 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.409185 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p"] Apr 24 17:12:06.409693 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.409541 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" podUID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" containerName="kserve-container" containerID="cri-o://ddaa90b64b007cb31f3cbf428c0b8336e104ce06e53fd808473e55c3a7b62d30" gracePeriod=30 Apr 24 17:12:06.409693 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.409588 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" podUID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" containerName="kube-rbac-proxy" containerID="cri-o://cd149153a3d20cfb299a5d8721291467e794604eb7358f2d19b681acf7594f82" gracePeriod=30 Apr 24 17:12:06.516535 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.516494 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"] Apr 24 17:12:06.516844 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.516830 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerName="storage-initializer" Apr 24 17:12:06.516891 
ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.516846 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerName="storage-initializer"
Apr 24 17:12:06.516891 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.516857 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerName="kube-rbac-proxy"
Apr 24 17:12:06.516891 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.516863 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerName="kube-rbac-proxy"
Apr 24 17:12:06.516891 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.516877 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerName="kserve-container"
Apr 24 17:12:06.516891 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.516883 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerName="kserve-container"
Apr 24 17:12:06.517079 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.516942 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerName="kserve-container"
Apr 24 17:12:06.517079 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.516955 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1cf9f61-843b-4d9e-855b-aacf98c500cf" containerName="kube-rbac-proxy"
Apr 24 17:12:06.520203 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.520178 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"
Apr 24 17:12:06.522856 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.522831 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-73341c-predictor-serving-cert\""
Apr 24 17:12:06.523131 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.523115 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-73341c-kube-rbac-proxy-sar-config\""
Apr 24 17:12:06.530849 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.530813 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"]
Apr 24 17:12:06.568714 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.568677 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-proxy-tls\") pod \"isvc-primary-73341c-predictor-8549f8dbd-p8pcj\" (UID: \"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b\") " pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"
Apr 24 17:12:06.568925 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.568721 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s25ks\" (UniqueName: \"kubernetes.io/projected/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-kube-api-access-s25ks\") pod \"isvc-primary-73341c-predictor-8549f8dbd-p8pcj\" (UID: \"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b\") " pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"
Apr 24 17:12:06.568925 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.568838 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-kserve-provision-location\") pod \"isvc-primary-73341c-predictor-8549f8dbd-p8pcj\" (UID: \"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b\") " pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"
Apr 24 17:12:06.568925 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.568903 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-73341c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-isvc-primary-73341c-kube-rbac-proxy-sar-config\") pod \"isvc-primary-73341c-predictor-8549f8dbd-p8pcj\" (UID: \"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b\") " pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"
Apr 24 17:12:06.669727 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.669627 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-proxy-tls\") pod \"isvc-primary-73341c-predictor-8549f8dbd-p8pcj\" (UID: \"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b\") " pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"
Apr 24 17:12:06.669727 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.669675 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s25ks\" (UniqueName: \"kubernetes.io/projected/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-kube-api-access-s25ks\") pod \"isvc-primary-73341c-predictor-8549f8dbd-p8pcj\" (UID: \"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b\") " pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"
Apr 24 17:12:06.669727 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.669710 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-kserve-provision-location\") pod \"isvc-primary-73341c-predictor-8549f8dbd-p8pcj\" (UID: \"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b\") " pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"
Apr 24 17:12:06.669968 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.669740 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-73341c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-isvc-primary-73341c-kube-rbac-proxy-sar-config\") pod \"isvc-primary-73341c-predictor-8549f8dbd-p8pcj\" (UID: \"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b\") " pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"
Apr 24 17:12:06.670266 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.670242 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-kserve-provision-location\") pod \"isvc-primary-73341c-predictor-8549f8dbd-p8pcj\" (UID: \"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b\") " pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"
Apr 24 17:12:06.670588 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.670525 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-73341c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-isvc-primary-73341c-kube-rbac-proxy-sar-config\") pod \"isvc-primary-73341c-predictor-8549f8dbd-p8pcj\" (UID: \"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b\") " pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"
Apr 24 17:12:06.672489 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.672461 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-proxy-tls\") pod \"isvc-primary-73341c-predictor-8549f8dbd-p8pcj\" (UID: \"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b\") " pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"
Apr 24 17:12:06.678603 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.678570 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s25ks\" (UniqueName: \"kubernetes.io/projected/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-kube-api-access-s25ks\") pod \"isvc-primary-73341c-predictor-8549f8dbd-p8pcj\" (UID: \"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b\") " pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"
Apr 24 17:12:06.832157 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.832113 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"
Apr 24 17:12:06.964647 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.964551 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"]
Apr 24 17:12:06.967542 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:12:06.967510 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8c5db1f_0c24_40e2_bdf4_49b335e16b9b.slice/crio-a7cf464876d23325cd9b11f90c95fbb5557722be826ef14b579a019bf8fc6246 WatchSource:0}: Error finding container a7cf464876d23325cd9b11f90c95fbb5557722be826ef14b579a019bf8fc6246: Status 404 returned error can't find the container with id a7cf464876d23325cd9b11f90c95fbb5557722be826ef14b579a019bf8fc6246
Apr 24 17:12:06.969452 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:06.969433 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 17:12:07.090524 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:07.090481 2573 generic.go:358] "Generic (PLEG): container finished" podID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" containerID="cd149153a3d20cfb299a5d8721291467e794604eb7358f2d19b681acf7594f82" exitCode=2
Apr 24 17:12:07.090708 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:07.090556 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" event={"ID":"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf","Type":"ContainerDied","Data":"cd149153a3d20cfb299a5d8721291467e794604eb7358f2d19b681acf7594f82"}
Apr 24 17:12:07.091899 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:07.091870 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj" event={"ID":"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b","Type":"ContainerStarted","Data":"f21de3fde595d780adea1e5c131e54be76edb0b716f64f763c80df8afc787c32"}
Apr 24 17:12:07.092039 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:07.091906 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj" event={"ID":"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b","Type":"ContainerStarted","Data":"a7cf464876d23325cd9b11f90c95fbb5557722be826ef14b579a019bf8fc6246"}
Apr 24 17:12:08.851259 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:08.851213 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" podUID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.36:8643/healthz\": dial tcp 10.134.0.36:8643: connect: connection refused"
Apr 24 17:12:10.561928 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:10.561902 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p"
Apr 24 17:12:10.599925 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:10.599822 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cfdj\" (UniqueName: \"kubernetes.io/projected/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-kube-api-access-4cfdj\") pod \"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf\" (UID: \"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf\") "
Apr 24 17:12:10.599925 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:10.599871 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf\" (UID: \"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf\") "
Apr 24 17:12:10.599925 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:10.599907 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-kserve-provision-location\") pod \"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf\" (UID: \"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf\") "
Apr 24 17:12:10.600195 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:10.599955 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-proxy-tls\") pod \"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf\" (UID: \"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf\") "
Apr 24 17:12:10.600378 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:10.600336 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" (UID: "5dcaeb88-b6e5-4461-8ab3-7075914b3eaf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 17:12:10.600378 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:10.600360 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config") pod "5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" (UID: "5dcaeb88-b6e5-4461-8ab3-7075914b3eaf"). InnerVolumeSpecName "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 17:12:10.602246 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:10.602206 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-kube-api-access-4cfdj" (OuterVolumeSpecName: "kube-api-access-4cfdj") pod "5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" (UID: "5dcaeb88-b6e5-4461-8ab3-7075914b3eaf"). InnerVolumeSpecName "kube-api-access-4cfdj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 17:12:10.602454 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:10.602257 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" (UID: "5dcaeb88-b6e5-4461-8ab3-7075914b3eaf"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 17:12:10.700525 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:10.700485 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4cfdj\" (UniqueName: \"kubernetes.io/projected/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-kube-api-access-4cfdj\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 17:12:10.700525 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:10.700522 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 17:12:10.700525 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:10.700535 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 17:12:10.700786 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:10.700545 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 17:12:11.105240 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:11.105202 2573 generic.go:358] "Generic (PLEG): container finished" podID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" containerID="ddaa90b64b007cb31f3cbf428c0b8336e104ce06e53fd808473e55c3a7b62d30" exitCode=0
Apr 24 17:12:11.105467 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:11.105280 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" event={"ID":"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf","Type":"ContainerDied","Data":"ddaa90b64b007cb31f3cbf428c0b8336e104ce06e53fd808473e55c3a7b62d30"}
Apr 24 17:12:11.105467 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:11.105336 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p" event={"ID":"5dcaeb88-b6e5-4461-8ab3-7075914b3eaf","Type":"ContainerDied","Data":"50211578428a0a04064efaf3e2492366e02d2476e9878f7c90646cda8f739057"}
Apr 24 17:12:11.105467 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:11.105339 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p"
Apr 24 17:12:11.105467 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:11.105354 2573 scope.go:117] "RemoveContainer" containerID="cd149153a3d20cfb299a5d8721291467e794604eb7358f2d19b681acf7594f82"
Apr 24 17:12:11.106793 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:11.106767 2573 generic.go:358] "Generic (PLEG): container finished" podID="f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" containerID="f21de3fde595d780adea1e5c131e54be76edb0b716f64f763c80df8afc787c32" exitCode=0
Apr 24 17:12:11.106928 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:11.106859 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj" event={"ID":"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b","Type":"ContainerDied","Data":"f21de3fde595d780adea1e5c131e54be76edb0b716f64f763c80df8afc787c32"}
Apr 24 17:12:11.114109 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:11.114028 2573 scope.go:117] "RemoveContainer" containerID="ddaa90b64b007cb31f3cbf428c0b8336e104ce06e53fd808473e55c3a7b62d30"
Apr 24 17:12:11.122414 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:11.122391 2573 scope.go:117] "RemoveContainer" containerID="004eebe2f45c6e38cee608544158842c1d39860538208510d69133bfe6f2fd2b"
Apr 24 17:12:11.131688 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:11.131666 2573 scope.go:117] "RemoveContainer" containerID="cd149153a3d20cfb299a5d8721291467e794604eb7358f2d19b681acf7594f82"
Apr 24 17:12:11.132034 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:12:11.132013 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd149153a3d20cfb299a5d8721291467e794604eb7358f2d19b681acf7594f82\": container with ID starting with cd149153a3d20cfb299a5d8721291467e794604eb7358f2d19b681acf7594f82 not found: ID does not exist" containerID="cd149153a3d20cfb299a5d8721291467e794604eb7358f2d19b681acf7594f82"
Apr 24 17:12:11.132117 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:11.132043 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd149153a3d20cfb299a5d8721291467e794604eb7358f2d19b681acf7594f82"} err="failed to get container status \"cd149153a3d20cfb299a5d8721291467e794604eb7358f2d19b681acf7594f82\": rpc error: code = NotFound desc = could not find container \"cd149153a3d20cfb299a5d8721291467e794604eb7358f2d19b681acf7594f82\": container with ID starting with cd149153a3d20cfb299a5d8721291467e794604eb7358f2d19b681acf7594f82 not found: ID does not exist"
Apr 24 17:12:11.132117 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:11.132063 2573 scope.go:117] "RemoveContainer" containerID="ddaa90b64b007cb31f3cbf428c0b8336e104ce06e53fd808473e55c3a7b62d30"
Apr 24 17:12:11.132375 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:12:11.132356 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddaa90b64b007cb31f3cbf428c0b8336e104ce06e53fd808473e55c3a7b62d30\": container with ID starting with ddaa90b64b007cb31f3cbf428c0b8336e104ce06e53fd808473e55c3a7b62d30 not found: ID does not exist" containerID="ddaa90b64b007cb31f3cbf428c0b8336e104ce06e53fd808473e55c3a7b62d30"
Apr 24 17:12:11.132432 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:11.132381 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddaa90b64b007cb31f3cbf428c0b8336e104ce06e53fd808473e55c3a7b62d30"} err="failed to get container status \"ddaa90b64b007cb31f3cbf428c0b8336e104ce06e53fd808473e55c3a7b62d30\": rpc error: code = NotFound desc = could not find container \"ddaa90b64b007cb31f3cbf428c0b8336e104ce06e53fd808473e55c3a7b62d30\": container with ID starting with ddaa90b64b007cb31f3cbf428c0b8336e104ce06e53fd808473e55c3a7b62d30 not found: ID does not exist"
Apr 24 17:12:11.132432 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:11.132399 2573 scope.go:117] "RemoveContainer" containerID="004eebe2f45c6e38cee608544158842c1d39860538208510d69133bfe6f2fd2b"
Apr 24 17:12:11.132668 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:12:11.132652 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"004eebe2f45c6e38cee608544158842c1d39860538208510d69133bfe6f2fd2b\": container with ID starting with 004eebe2f45c6e38cee608544158842c1d39860538208510d69133bfe6f2fd2b not found: ID does not exist" containerID="004eebe2f45c6e38cee608544158842c1d39860538208510d69133bfe6f2fd2b"
Apr 24 17:12:11.132717 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:11.132670 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"004eebe2f45c6e38cee608544158842c1d39860538208510d69133bfe6f2fd2b"} err="failed to get container status \"004eebe2f45c6e38cee608544158842c1d39860538208510d69133bfe6f2fd2b\": rpc error: code = NotFound desc = could not find container \"004eebe2f45c6e38cee608544158842c1d39860538208510d69133bfe6f2fd2b\": container with ID starting with 004eebe2f45c6e38cee608544158842c1d39860538208510d69133bfe6f2fd2b not found: ID does not exist"
Apr 24 17:12:11.143922 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:11.143892 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p"]
Apr 24 17:12:11.147980 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:11.147949 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-66p9p"]
Apr 24 17:12:12.112567 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:12.112528 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj" event={"ID":"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b","Type":"ContainerStarted","Data":"298cdf88eb6a1c527f75a9dddd69e7dc70a39bec724f7b10bf3e088ad0fdcc25"}
Apr 24 17:12:12.112567 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:12.112565 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj" event={"ID":"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b","Type":"ContainerStarted","Data":"7d16d41cf466ee46b4ed962d13c9e862cbc4db51e24f37b31501c5c05c88098f"}
Apr 24 17:12:12.113072 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:12.112871 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"
Apr 24 17:12:12.113072 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:12.112993 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"
Apr 24 17:12:12.114252 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:12.114223 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj" podUID="f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 24 17:12:12.132330 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:12.132249 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj" podStartSLOduration=6.132232185 podStartE2EDuration="6.132232185s" podCreationTimestamp="2026-04-24 17:12:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:12:12.131124524 +0000 UTC m=+1990.395839923" watchObservedRunningTime="2026-04-24 17:12:12.132232185 +0000 UTC m=+1990.396947583"
Apr 24 17:12:12.216758 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:12.216719 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" path="/var/lib/kubelet/pods/5dcaeb88-b6e5-4461-8ab3-7075914b3eaf/volumes"
Apr 24 17:12:13.116059 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:13.116016 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj" podUID="f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 24 17:12:18.120518 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:18.120487 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"
Apr 24 17:12:18.121018 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:18.120984 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj" podUID="f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 24 17:12:28.121448 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:28.121407 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj" podUID="f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 24 17:12:38.120956 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:38.120913 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj" podUID="f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 24 17:12:48.121916 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:48.121874 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj" podUID="f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 24 17:12:58.121102 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:12:58.121060 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj" podUID="f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 24 17:13:08.121688 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:08.121647 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj" podUID="f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 24 17:13:18.122060 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:18.122027 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"
Apr 24 17:13:26.692522 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.692438 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w"]
Apr 24 17:13:26.692915 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.692894 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" containerName="kserve-container"
Apr 24 17:13:26.692915 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.692916 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" containerName="kserve-container"
Apr 24 17:13:26.693084 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.692938 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" containerName="storage-initializer"
Apr 24 17:13:26.693084 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.692947 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" containerName="storage-initializer"
Apr 24 17:13:26.693084 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.692973 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" containerName="kube-rbac-proxy"
Apr 24 17:13:26.693084 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.692981 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" containerName="kube-rbac-proxy"
Apr 24 17:13:26.693084 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.693049 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" containerName="kube-rbac-proxy"
Apr 24 17:13:26.693084 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.693063 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5dcaeb88-b6e5-4461-8ab3-7075914b3eaf" containerName="kserve-container"
Apr 24 17:13:26.696699 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.696680 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w"
Apr 24 17:13:26.699444 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.699401 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-73341c-kube-rbac-proxy-sar-config\""
Apr 24 17:13:26.699444 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.699410 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 24 17:13:26.699664 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.699527 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-73341c-dockercfg-mp94p\""
Apr 24 17:13:26.699664 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.699530 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-73341c-predictor-serving-cert\""
Apr 24 17:13:26.699664 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.699537 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-73341c\""
Apr 24 17:13:26.718662 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.718630 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w"]
Apr 24 17:13:26.791831 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.791791 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da43a52b-29e5-4e7d-aee5-af05e07d8566-proxy-tls\") pod \"isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w\" (UID: \"da43a52b-29e5-4e7d-aee5-af05e07d8566\") " pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w"
Apr 24 17:13:26.792018 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.791852 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-73341c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da43a52b-29e5-4e7d-aee5-af05e07d8566-isvc-secondary-73341c-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w\" (UID: \"da43a52b-29e5-4e7d-aee5-af05e07d8566\") " pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w"
Apr 24 17:13:26.792018 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.791898 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da43a52b-29e5-4e7d-aee5-af05e07d8566-kserve-provision-location\") pod \"isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w\" (UID: \"da43a52b-29e5-4e7d-aee5-af05e07d8566\") " pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w"
Apr 24 17:13:26.792018 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.791918 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdtfr\" (UniqueName: \"kubernetes.io/projected/da43a52b-29e5-4e7d-aee5-af05e07d8566-kube-api-access-zdtfr\") pod \"isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w\" (UID: \"da43a52b-29e5-4e7d-aee5-af05e07d8566\") " pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w"
Apr 24 17:13:26.792018 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.791937 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/da43a52b-29e5-4e7d-aee5-af05e07d8566-cabundle-cert\") pod \"isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w\" (UID: \"da43a52b-29e5-4e7d-aee5-af05e07d8566\") " pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w"
Apr 24 17:13:26.893170 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.893131 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da43a52b-29e5-4e7d-aee5-af05e07d8566-kserve-provision-location\") pod \"isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w\" (UID: \"da43a52b-29e5-4e7d-aee5-af05e07d8566\") " pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w"
Apr 24 17:13:26.893388 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.893175 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdtfr\" (UniqueName: \"kubernetes.io/projected/da43a52b-29e5-4e7d-aee5-af05e07d8566-kube-api-access-zdtfr\") pod \"isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w\" (UID: \"da43a52b-29e5-4e7d-aee5-af05e07d8566\") " pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w"
Apr 24 17:13:26.893388 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.893208 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/da43a52b-29e5-4e7d-aee5-af05e07d8566-cabundle-cert\") pod \"isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w\" (UID: \"da43a52b-29e5-4e7d-aee5-af05e07d8566\") " pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w"
Apr 24 17:13:26.893388 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.893245 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da43a52b-29e5-4e7d-aee5-af05e07d8566-proxy-tls\") pod \"isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w\" (UID: \"da43a52b-29e5-4e7d-aee5-af05e07d8566\") " pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w"
Apr 24 17:13:26.893388 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.893303 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-73341c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da43a52b-29e5-4e7d-aee5-af05e07d8566-isvc-secondary-73341c-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w\" (UID: \"da43a52b-29e5-4e7d-aee5-af05e07d8566\") " pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w"
Apr 24 17:13:26.893633 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.893613 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da43a52b-29e5-4e7d-aee5-af05e07d8566-kserve-provision-location\") pod \"isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w\" (UID: \"da43a52b-29e5-4e7d-aee5-af05e07d8566\") " pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w"
Apr 24 17:13:26.893931 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.893910 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/da43a52b-29e5-4e7d-aee5-af05e07d8566-cabundle-cert\") pod \"isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w\" (UID: \"da43a52b-29e5-4e7d-aee5-af05e07d8566\") " pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w"
Apr 24 17:13:26.893931 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.893924 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-73341c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da43a52b-29e5-4e7d-aee5-af05e07d8566-isvc-secondary-73341c-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w\" (UID: \"da43a52b-29e5-4e7d-aee5-af05e07d8566\") " pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w"
Apr 24 17:13:26.895964 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.895937 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da43a52b-29e5-4e7d-aee5-af05e07d8566-proxy-tls\")
pod \"isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w\" (UID: \"da43a52b-29e5-4e7d-aee5-af05e07d8566\") " pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w" Apr 24 17:13:26.900924 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:26.900900 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdtfr\" (UniqueName: \"kubernetes.io/projected/da43a52b-29e5-4e7d-aee5-af05e07d8566-kube-api-access-zdtfr\") pod \"isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w\" (UID: \"da43a52b-29e5-4e7d-aee5-af05e07d8566\") " pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w" Apr 24 17:13:27.015214 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:27.015168 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w" Apr 24 17:13:27.142969 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:27.142828 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w"] Apr 24 17:13:27.145899 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:13:27.145867 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda43a52b_29e5_4e7d_aee5_af05e07d8566.slice/crio-ee6e21aced0e69c1beb2272b7c91d18947557f8417bcf6008e5ac5cdb89ad928 WatchSource:0}: Error finding container ee6e21aced0e69c1beb2272b7c91d18947557f8417bcf6008e5ac5cdb89ad928: Status 404 returned error can't find the container with id ee6e21aced0e69c1beb2272b7c91d18947557f8417bcf6008e5ac5cdb89ad928 Apr 24 17:13:27.333785 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:27.333685 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w" 
event={"ID":"da43a52b-29e5-4e7d-aee5-af05e07d8566","Type":"ContainerStarted","Data":"5be3694b8abce668068001c648cb2b4f7662277d17ad6752f90f2f2ba73a9b00"} Apr 24 17:13:27.333785 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:27.333737 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w" event={"ID":"da43a52b-29e5-4e7d-aee5-af05e07d8566","Type":"ContainerStarted","Data":"ee6e21aced0e69c1beb2272b7c91d18947557f8417bcf6008e5ac5cdb89ad928"} Apr 24 17:13:32.349833 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:32.349803 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w_da43a52b-29e5-4e7d-aee5-af05e07d8566/storage-initializer/0.log" Apr 24 17:13:32.350216 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:32.349846 2573 generic.go:358] "Generic (PLEG): container finished" podID="da43a52b-29e5-4e7d-aee5-af05e07d8566" containerID="5be3694b8abce668068001c648cb2b4f7662277d17ad6752f90f2f2ba73a9b00" exitCode=1 Apr 24 17:13:32.350216 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:32.349910 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w" event={"ID":"da43a52b-29e5-4e7d-aee5-af05e07d8566","Type":"ContainerDied","Data":"5be3694b8abce668068001c648cb2b4f7662277d17ad6752f90f2f2ba73a9b00"} Apr 24 17:13:33.354693 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:33.354660 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w_da43a52b-29e5-4e7d-aee5-af05e07d8566/storage-initializer/0.log" Apr 24 17:13:33.355090 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:33.354747 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w" 
event={"ID":"da43a52b-29e5-4e7d-aee5-af05e07d8566","Type":"ContainerStarted","Data":"f20686336b638caa4d2fe61c2f7aa5d250b83c376b4ae0275b2f99051bdfc472"} Apr 24 17:13:37.366883 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:37.366852 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w_da43a52b-29e5-4e7d-aee5-af05e07d8566/storage-initializer/1.log" Apr 24 17:13:37.367268 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:37.367200 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w_da43a52b-29e5-4e7d-aee5-af05e07d8566/storage-initializer/0.log" Apr 24 17:13:37.367268 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:37.367232 2573 generic.go:358] "Generic (PLEG): container finished" podID="da43a52b-29e5-4e7d-aee5-af05e07d8566" containerID="f20686336b638caa4d2fe61c2f7aa5d250b83c376b4ae0275b2f99051bdfc472" exitCode=1 Apr 24 17:13:37.367379 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:37.367290 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w" event={"ID":"da43a52b-29e5-4e7d-aee5-af05e07d8566","Type":"ContainerDied","Data":"f20686336b638caa4d2fe61c2f7aa5d250b83c376b4ae0275b2f99051bdfc472"} Apr 24 17:13:37.367379 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:37.367349 2573 scope.go:117] "RemoveContainer" containerID="5be3694b8abce668068001c648cb2b4f7662277d17ad6752f90f2f2ba73a9b00" Apr 24 17:13:37.367747 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:37.367726 2573 scope.go:117] "RemoveContainer" containerID="5be3694b8abce668068001c648cb2b4f7662277d17ad6752f90f2f2ba73a9b00" Apr 24 17:13:37.378024 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:13:37.377993 2573 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container 
k8s_storage-initializer_isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w_kserve-ci-e2e-test_da43a52b-29e5-4e7d-aee5-af05e07d8566_0 in pod sandbox ee6e21aced0e69c1beb2272b7c91d18947557f8417bcf6008e5ac5cdb89ad928 from index: no such id: '5be3694b8abce668068001c648cb2b4f7662277d17ad6752f90f2f2ba73a9b00'" containerID="5be3694b8abce668068001c648cb2b4f7662277d17ad6752f90f2f2ba73a9b00" Apr 24 17:13:37.378103 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:37.378039 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5be3694b8abce668068001c648cb2b4f7662277d17ad6752f90f2f2ba73a9b00"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w_kserve-ci-e2e-test_da43a52b-29e5-4e7d-aee5-af05e07d8566_0 in pod sandbox ee6e21aced0e69c1beb2272b7c91d18947557f8417bcf6008e5ac5cdb89ad928 from index: no such id: '5be3694b8abce668068001c648cb2b4f7662277d17ad6752f90f2f2ba73a9b00'" Apr 24 17:13:37.378237 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:13:37.378216 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w_kserve-ci-e2e-test(da43a52b-29e5-4e7d-aee5-af05e07d8566)\"" pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w" podUID="da43a52b-29e5-4e7d-aee5-af05e07d8566" Apr 24 17:13:38.371179 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:38.371148 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w_da43a52b-29e5-4e7d-aee5-af05e07d8566/storage-initializer/1.log" Apr 24 17:13:42.763657 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:42.763622 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w"] Apr 24 17:13:42.808842 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:42.808806 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"] Apr 24 17:13:42.809207 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:42.809178 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj" podUID="f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" containerName="kserve-container" containerID="cri-o://7d16d41cf466ee46b4ed962d13c9e862cbc4db51e24f37b31501c5c05c88098f" gracePeriod=30 Apr 24 17:13:42.809323 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:42.809257 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj" podUID="f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" containerName="kube-rbac-proxy" containerID="cri-o://298cdf88eb6a1c527f75a9dddd69e7dc70a39bec724f7b10bf3e088ad0fdcc25" gracePeriod=30 Apr 24 17:13:42.915521 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:42.915493 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879"] Apr 24 17:13:42.920241 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:42.920219 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" Apr 24 17:13:42.922825 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:42.922806 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-1c499d-predictor-serving-cert\"" Apr 24 17:13:42.922911 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:42.922812 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-1c499d-dockercfg-4f9sn\"" Apr 24 17:13:42.922911 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:42.922859 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-1c499d-kube-rbac-proxy-sar-config\"" Apr 24 17:13:42.929574 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:42.929556 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-1c499d\"" Apr 24 17:13:42.930198 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:42.930182 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w_da43a52b-29e5-4e7d-aee5-af05e07d8566/storage-initializer/1.log" Apr 24 17:13:42.930272 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:42.930244 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w" Apr 24 17:13:42.950082 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:42.950050 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879"] Apr 24 17:13:43.018956 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.018865 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/da43a52b-29e5-4e7d-aee5-af05e07d8566-cabundle-cert\") pod \"da43a52b-29e5-4e7d-aee5-af05e07d8566\" (UID: \"da43a52b-29e5-4e7d-aee5-af05e07d8566\") " Apr 24 17:13:43.018956 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.018915 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da43a52b-29e5-4e7d-aee5-af05e07d8566-kserve-provision-location\") pod \"da43a52b-29e5-4e7d-aee5-af05e07d8566\" (UID: \"da43a52b-29e5-4e7d-aee5-af05e07d8566\") " Apr 24 17:13:43.018956 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.018934 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdtfr\" (UniqueName: \"kubernetes.io/projected/da43a52b-29e5-4e7d-aee5-af05e07d8566-kube-api-access-zdtfr\") pod \"da43a52b-29e5-4e7d-aee5-af05e07d8566\" (UID: \"da43a52b-29e5-4e7d-aee5-af05e07d8566\") " Apr 24 17:13:43.019193 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.019064 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da43a52b-29e5-4e7d-aee5-af05e07d8566-proxy-tls\") pod \"da43a52b-29e5-4e7d-aee5-af05e07d8566\" (UID: \"da43a52b-29e5-4e7d-aee5-af05e07d8566\") " Apr 24 17:13:43.019193 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.019186 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"isvc-secondary-73341c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da43a52b-29e5-4e7d-aee5-af05e07d8566-isvc-secondary-73341c-kube-rbac-proxy-sar-config\") pod \"da43a52b-29e5-4e7d-aee5-af05e07d8566\" (UID: \"da43a52b-29e5-4e7d-aee5-af05e07d8566\") " Apr 24 17:13:43.019277 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.019217 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da43a52b-29e5-4e7d-aee5-af05e07d8566-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "da43a52b-29e5-4e7d-aee5-af05e07d8566" (UID: "da43a52b-29e5-4e7d-aee5-af05e07d8566"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:13:43.019373 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.019333 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-1c499d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/19716ec9-a7ef-403c-92ff-331d4a97d4da-isvc-init-fail-1c499d-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879\" (UID: \"19716ec9-a7ef-403c-92ff-331d4a97d4da\") " pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" Apr 24 17:13:43.019373 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.019361 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da43a52b-29e5-4e7d-aee5-af05e07d8566-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "da43a52b-29e5-4e7d-aee5-af05e07d8566" (UID: "da43a52b-29e5-4e7d-aee5-af05e07d8566"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:13:43.019483 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.019421 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/19716ec9-a7ef-403c-92ff-331d4a97d4da-kserve-provision-location\") pod \"isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879\" (UID: \"19716ec9-a7ef-403c-92ff-331d4a97d4da\") " pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" Apr 24 17:13:43.019570 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.019553 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/19716ec9-a7ef-403c-92ff-331d4a97d4da-cabundle-cert\") pod \"isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879\" (UID: \"19716ec9-a7ef-403c-92ff-331d4a97d4da\") " pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" Apr 24 17:13:43.019633 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.019580 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/19716ec9-a7ef-403c-92ff-331d4a97d4da-proxy-tls\") pod \"isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879\" (UID: \"19716ec9-a7ef-403c-92ff-331d4a97d4da\") " pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" Apr 24 17:13:43.019633 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.019596 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da43a52b-29e5-4e7d-aee5-af05e07d8566-isvc-secondary-73341c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-73341c-kube-rbac-proxy-sar-config") pod "da43a52b-29e5-4e7d-aee5-af05e07d8566" (UID: "da43a52b-29e5-4e7d-aee5-af05e07d8566"). InnerVolumeSpecName "isvc-secondary-73341c-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:13:43.019729 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.019707 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cdxx\" (UniqueName: \"kubernetes.io/projected/19716ec9-a7ef-403c-92ff-331d4a97d4da-kube-api-access-5cdxx\") pod \"isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879\" (UID: \"19716ec9-a7ef-403c-92ff-331d4a97d4da\") " pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" Apr 24 17:13:43.019975 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.019959 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-73341c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da43a52b-29e5-4e7d-aee5-af05e07d8566-isvc-secondary-73341c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:13:43.020013 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.019982 2573 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/da43a52b-29e5-4e7d-aee5-af05e07d8566-cabundle-cert\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:13:43.020013 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.019999 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da43a52b-29e5-4e7d-aee5-af05e07d8566-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:13:43.021513 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.021485 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da43a52b-29e5-4e7d-aee5-af05e07d8566-kube-api-access-zdtfr" (OuterVolumeSpecName: "kube-api-access-zdtfr") pod "da43a52b-29e5-4e7d-aee5-af05e07d8566" (UID: "da43a52b-29e5-4e7d-aee5-af05e07d8566"). InnerVolumeSpecName "kube-api-access-zdtfr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:13:43.021513 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.021497 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da43a52b-29e5-4e7d-aee5-af05e07d8566-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "da43a52b-29e5-4e7d-aee5-af05e07d8566" (UID: "da43a52b-29e5-4e7d-aee5-af05e07d8566"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:13:43.117173 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.117124 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj" podUID="f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.37:8643/healthz\": dial tcp 10.134.0.37:8643: connect: connection refused" Apr 24 17:13:43.120529 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.120499 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-1c499d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/19716ec9-a7ef-403c-92ff-331d4a97d4da-isvc-init-fail-1c499d-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879\" (UID: \"19716ec9-a7ef-403c-92ff-331d4a97d4da\") " pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" Apr 24 17:13:43.120613 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.120545 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/19716ec9-a7ef-403c-92ff-331d4a97d4da-kserve-provision-location\") pod \"isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879\" (UID: \"19716ec9-a7ef-403c-92ff-331d4a97d4da\") " pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" Apr 24 17:13:43.120613 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.120579 
2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/19716ec9-a7ef-403c-92ff-331d4a97d4da-cabundle-cert\") pod \"isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879\" (UID: \"19716ec9-a7ef-403c-92ff-331d4a97d4da\") " pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" Apr 24 17:13:43.120613 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.120597 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/19716ec9-a7ef-403c-92ff-331d4a97d4da-proxy-tls\") pod \"isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879\" (UID: \"19716ec9-a7ef-403c-92ff-331d4a97d4da\") " pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" Apr 24 17:13:43.120759 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.120639 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5cdxx\" (UniqueName: \"kubernetes.io/projected/19716ec9-a7ef-403c-92ff-331d4a97d4da-kube-api-access-5cdxx\") pod \"isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879\" (UID: \"19716ec9-a7ef-403c-92ff-331d4a97d4da\") " pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" Apr 24 17:13:43.120759 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.120668 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zdtfr\" (UniqueName: \"kubernetes.io/projected/da43a52b-29e5-4e7d-aee5-af05e07d8566-kube-api-access-zdtfr\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:13:43.120759 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.120680 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da43a52b-29e5-4e7d-aee5-af05e07d8566-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:13:43.120885 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:13:43.120781 2573 
secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-serving-cert: secret "isvc-init-fail-1c499d-predictor-serving-cert" not found Apr 24 17:13:43.120885 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:13:43.120870 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19716ec9-a7ef-403c-92ff-331d4a97d4da-proxy-tls podName:19716ec9-a7ef-403c-92ff-331d4a97d4da nodeName:}" failed. No retries permitted until 2026-04-24 17:13:43.620845194 +0000 UTC m=+2081.885560579 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/19716ec9-a7ef-403c-92ff-331d4a97d4da-proxy-tls") pod "isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" (UID: "19716ec9-a7ef-403c-92ff-331d4a97d4da") : secret "isvc-init-fail-1c499d-predictor-serving-cert" not found Apr 24 17:13:43.121046 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.121027 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/19716ec9-a7ef-403c-92ff-331d4a97d4da-kserve-provision-location\") pod \"isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879\" (UID: \"19716ec9-a7ef-403c-92ff-331d4a97d4da\") " pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" Apr 24 17:13:43.121157 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.121138 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-1c499d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/19716ec9-a7ef-403c-92ff-331d4a97d4da-isvc-init-fail-1c499d-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879\" (UID: \"19716ec9-a7ef-403c-92ff-331d4a97d4da\") " pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" Apr 24 17:13:43.121193 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.121175 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/19716ec9-a7ef-403c-92ff-331d4a97d4da-cabundle-cert\") pod \"isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879\" (UID: \"19716ec9-a7ef-403c-92ff-331d4a97d4da\") " pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" Apr 24 17:13:43.128961 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.128939 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cdxx\" (UniqueName: \"kubernetes.io/projected/19716ec9-a7ef-403c-92ff-331d4a97d4da-kube-api-access-5cdxx\") pod \"isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879\" (UID: \"19716ec9-a7ef-403c-92ff-331d4a97d4da\") " pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" Apr 24 17:13:43.386123 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.386044 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w_da43a52b-29e5-4e7d-aee5-af05e07d8566/storage-initializer/1.log" Apr 24 17:13:43.386277 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.386149 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w" event={"ID":"da43a52b-29e5-4e7d-aee5-af05e07d8566","Type":"ContainerDied","Data":"ee6e21aced0e69c1beb2272b7c91d18947557f8417bcf6008e5ac5cdb89ad928"} Apr 24 17:13:43.386277 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.386197 2573 scope.go:117] "RemoveContainer" containerID="f20686336b638caa4d2fe61c2f7aa5d250b83c376b4ae0275b2f99051bdfc472" Apr 24 17:13:43.386277 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.386200 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w" Apr 24 17:13:43.388508 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.388483 2573 generic.go:358] "Generic (PLEG): container finished" podID="f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" containerID="298cdf88eb6a1c527f75a9dddd69e7dc70a39bec724f7b10bf3e088ad0fdcc25" exitCode=2 Apr 24 17:13:43.388632 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.388552 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj" event={"ID":"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b","Type":"ContainerDied","Data":"298cdf88eb6a1c527f75a9dddd69e7dc70a39bec724f7b10bf3e088ad0fdcc25"} Apr 24 17:13:43.421700 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.421666 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w"] Apr 24 17:13:43.425575 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.425547 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-73341c-predictor-7b74fcc47f-jcz9w"] Apr 24 17:13:43.625170 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:43.625123 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/19716ec9-a7ef-403c-92ff-331d4a97d4da-proxy-tls\") pod \"isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879\" (UID: \"19716ec9-a7ef-403c-92ff-331d4a97d4da\") " pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" Apr 24 17:13:43.625352 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:13:43.625292 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-serving-cert: secret "isvc-init-fail-1c499d-predictor-serving-cert" not found Apr 24 17:13:43.625419 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:13:43.625405 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/19716ec9-a7ef-403c-92ff-331d4a97d4da-proxy-tls podName:19716ec9-a7ef-403c-92ff-331d4a97d4da nodeName:}" failed. No retries permitted until 2026-04-24 17:13:44.625386526 +0000 UTC m=+2082.890101901 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/19716ec9-a7ef-403c-92ff-331d4a97d4da-proxy-tls") pod "isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" (UID: "19716ec9-a7ef-403c-92ff-331d4a97d4da") : secret "isvc-init-fail-1c499d-predictor-serving-cert" not found Apr 24 17:13:44.218633 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:44.218595 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da43a52b-29e5-4e7d-aee5-af05e07d8566" path="/var/lib/kubelet/pods/da43a52b-29e5-4e7d-aee5-af05e07d8566/volumes" Apr 24 17:13:44.634353 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:44.634291 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/19716ec9-a7ef-403c-92ff-331d4a97d4da-proxy-tls\") pod \"isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879\" (UID: \"19716ec9-a7ef-403c-92ff-331d4a97d4da\") " pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" Apr 24 17:13:44.636988 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:44.636967 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/19716ec9-a7ef-403c-92ff-331d4a97d4da-proxy-tls\") pod \"isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879\" (UID: \"19716ec9-a7ef-403c-92ff-331d4a97d4da\") " pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" Apr 24 17:13:44.738855 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:44.738817 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" Apr 24 17:13:44.870333 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:44.870279 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879"] Apr 24 17:13:44.873805 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:13:44.873768 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19716ec9_a7ef_403c_92ff_331d4a97d4da.slice/crio-4e5c438f3161ad48c969ab9b22c9a2c0305a18c463da8f7e7e63871288064bc1 WatchSource:0}: Error finding container 4e5c438f3161ad48c969ab9b22c9a2c0305a18c463da8f7e7e63871288064bc1: Status 404 returned error can't find the container with id 4e5c438f3161ad48c969ab9b22c9a2c0305a18c463da8f7e7e63871288064bc1 Apr 24 17:13:45.396730 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:45.396692 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" event={"ID":"19716ec9-a7ef-403c-92ff-331d4a97d4da","Type":"ContainerStarted","Data":"1bdc373a9848d1527df6ee6a8bfdb7b9df0fc76fb8416e4dae9387d5bd87be8f"} Apr 24 17:13:45.396730 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:45.396733 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" event={"ID":"19716ec9-a7ef-403c-92ff-331d4a97d4da","Type":"ContainerStarted","Data":"4e5c438f3161ad48c969ab9b22c9a2c0305a18c463da8f7e7e63871288064bc1"} Apr 24 17:13:47.653940 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:47.653915 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj" Apr 24 17:13:47.758597 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:47.758557 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s25ks\" (UniqueName: \"kubernetes.io/projected/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-kube-api-access-s25ks\") pod \"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b\" (UID: \"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b\") " Apr 24 17:13:47.758777 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:47.758627 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-73341c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-isvc-primary-73341c-kube-rbac-proxy-sar-config\") pod \"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b\" (UID: \"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b\") " Apr 24 17:13:47.758777 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:47.758655 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-kserve-provision-location\") pod \"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b\" (UID: \"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b\") " Apr 24 17:13:47.758777 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:47.758688 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-proxy-tls\") pod \"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b\" (UID: \"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b\") " Apr 24 17:13:47.759000 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:47.758975 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" (UID: "f8c5db1f-0c24-40e2-bdf4-49b335e16b9b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:13:47.759074 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:47.759007 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-isvc-primary-73341c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-73341c-kube-rbac-proxy-sar-config") pod "f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" (UID: "f8c5db1f-0c24-40e2-bdf4-49b335e16b9b"). InnerVolumeSpecName "isvc-primary-73341c-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:13:47.760982 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:47.760952 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" (UID: "f8c5db1f-0c24-40e2-bdf4-49b335e16b9b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:13:47.760982 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:47.760969 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-kube-api-access-s25ks" (OuterVolumeSpecName: "kube-api-access-s25ks") pod "f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" (UID: "f8c5db1f-0c24-40e2-bdf4-49b335e16b9b"). InnerVolumeSpecName "kube-api-access-s25ks". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:13:47.860216 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:47.860171 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-73341c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-isvc-primary-73341c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:13:47.860425 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:47.860232 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:13:47.860425 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:47.860243 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:13:47.860425 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:47.860252 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s25ks\" (UniqueName: \"kubernetes.io/projected/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b-kube-api-access-s25ks\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:13:48.407794 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:48.407761 2573 generic.go:358] "Generic (PLEG): container finished" podID="f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" containerID="7d16d41cf466ee46b4ed962d13c9e862cbc4db51e24f37b31501c5c05c88098f" exitCode=0 Apr 24 17:13:48.407969 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:48.407852 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj" Apr 24 17:13:48.407969 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:48.407847 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj" event={"ID":"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b","Type":"ContainerDied","Data":"7d16d41cf466ee46b4ed962d13c9e862cbc4db51e24f37b31501c5c05c88098f"} Apr 24 17:13:48.407969 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:48.407965 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj" event={"ID":"f8c5db1f-0c24-40e2-bdf4-49b335e16b9b","Type":"ContainerDied","Data":"a7cf464876d23325cd9b11f90c95fbb5557722be826ef14b579a019bf8fc6246"} Apr 24 17:13:48.408097 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:48.407981 2573 scope.go:117] "RemoveContainer" containerID="298cdf88eb6a1c527f75a9dddd69e7dc70a39bec724f7b10bf3e088ad0fdcc25" Apr 24 17:13:48.416460 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:48.416442 2573 scope.go:117] "RemoveContainer" containerID="7d16d41cf466ee46b4ed962d13c9e862cbc4db51e24f37b31501c5c05c88098f" Apr 24 17:13:48.424235 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:48.424186 2573 scope.go:117] "RemoveContainer" containerID="f21de3fde595d780adea1e5c131e54be76edb0b716f64f763c80df8afc787c32" Apr 24 17:13:48.426083 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:48.426052 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"] Apr 24 17:13:48.429125 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:48.429101 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-73341c-predictor-8549f8dbd-p8pcj"] Apr 24 17:13:48.432888 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:48.432859 2573 scope.go:117] "RemoveContainer" 
containerID="298cdf88eb6a1c527f75a9dddd69e7dc70a39bec724f7b10bf3e088ad0fdcc25" Apr 24 17:13:48.433200 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:13:48.433179 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"298cdf88eb6a1c527f75a9dddd69e7dc70a39bec724f7b10bf3e088ad0fdcc25\": container with ID starting with 298cdf88eb6a1c527f75a9dddd69e7dc70a39bec724f7b10bf3e088ad0fdcc25 not found: ID does not exist" containerID="298cdf88eb6a1c527f75a9dddd69e7dc70a39bec724f7b10bf3e088ad0fdcc25" Apr 24 17:13:48.433263 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:48.433213 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"298cdf88eb6a1c527f75a9dddd69e7dc70a39bec724f7b10bf3e088ad0fdcc25"} err="failed to get container status \"298cdf88eb6a1c527f75a9dddd69e7dc70a39bec724f7b10bf3e088ad0fdcc25\": rpc error: code = NotFound desc = could not find container \"298cdf88eb6a1c527f75a9dddd69e7dc70a39bec724f7b10bf3e088ad0fdcc25\": container with ID starting with 298cdf88eb6a1c527f75a9dddd69e7dc70a39bec724f7b10bf3e088ad0fdcc25 not found: ID does not exist" Apr 24 17:13:48.433263 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:48.433235 2573 scope.go:117] "RemoveContainer" containerID="7d16d41cf466ee46b4ed962d13c9e862cbc4db51e24f37b31501c5c05c88098f" Apr 24 17:13:48.433616 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:13:48.433593 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d16d41cf466ee46b4ed962d13c9e862cbc4db51e24f37b31501c5c05c88098f\": container with ID starting with 7d16d41cf466ee46b4ed962d13c9e862cbc4db51e24f37b31501c5c05c88098f not found: ID does not exist" containerID="7d16d41cf466ee46b4ed962d13c9e862cbc4db51e24f37b31501c5c05c88098f" Apr 24 17:13:48.433688 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:48.433621 2573 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"7d16d41cf466ee46b4ed962d13c9e862cbc4db51e24f37b31501c5c05c88098f"} err="failed to get container status \"7d16d41cf466ee46b4ed962d13c9e862cbc4db51e24f37b31501c5c05c88098f\": rpc error: code = NotFound desc = could not find container \"7d16d41cf466ee46b4ed962d13c9e862cbc4db51e24f37b31501c5c05c88098f\": container with ID starting with 7d16d41cf466ee46b4ed962d13c9e862cbc4db51e24f37b31501c5c05c88098f not found: ID does not exist" Apr 24 17:13:48.433688 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:48.433638 2573 scope.go:117] "RemoveContainer" containerID="f21de3fde595d780adea1e5c131e54be76edb0b716f64f763c80df8afc787c32" Apr 24 17:13:48.433877 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:13:48.433861 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f21de3fde595d780adea1e5c131e54be76edb0b716f64f763c80df8afc787c32\": container with ID starting with f21de3fde595d780adea1e5c131e54be76edb0b716f64f763c80df8afc787c32 not found: ID does not exist" containerID="f21de3fde595d780adea1e5c131e54be76edb0b716f64f763c80df8afc787c32" Apr 24 17:13:48.433915 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:48.433882 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f21de3fde595d780adea1e5c131e54be76edb0b716f64f763c80df8afc787c32"} err="failed to get container status \"f21de3fde595d780adea1e5c131e54be76edb0b716f64f763c80df8afc787c32\": rpc error: code = NotFound desc = could not find container \"f21de3fde595d780adea1e5c131e54be76edb0b716f64f763c80df8afc787c32\": container with ID starting with f21de3fde595d780adea1e5c131e54be76edb0b716f64f763c80df8afc787c32 not found: ID does not exist" Apr 24 17:13:50.216936 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:50.216902 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" 
path="/var/lib/kubelet/pods/f8c5db1f-0c24-40e2-bdf4-49b335e16b9b/volumes" Apr 24 17:13:51.417634 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:51.417605 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879_19716ec9-a7ef-403c-92ff-331d4a97d4da/storage-initializer/0.log" Apr 24 17:13:51.418021 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:51.417647 2573 generic.go:358] "Generic (PLEG): container finished" podID="19716ec9-a7ef-403c-92ff-331d4a97d4da" containerID="1bdc373a9848d1527df6ee6a8bfdb7b9df0fc76fb8416e4dae9387d5bd87be8f" exitCode=1 Apr 24 17:13:51.418021 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:51.417705 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" event={"ID":"19716ec9-a7ef-403c-92ff-331d4a97d4da","Type":"ContainerDied","Data":"1bdc373a9848d1527df6ee6a8bfdb7b9df0fc76fb8416e4dae9387d5bd87be8f"} Apr 24 17:13:52.422487 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:52.422457 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879_19716ec9-a7ef-403c-92ff-331d4a97d4da/storage-initializer/0.log" Apr 24 17:13:52.422879 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:52.422510 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" event={"ID":"19716ec9-a7ef-403c-92ff-331d4a97d4da","Type":"ContainerStarted","Data":"21fbee4d320a10ea82c4a90f537ed99305821215135d1a98ca7b96d93dfa656d"} Apr 24 17:13:52.893888 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:52.893846 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879"] Apr 24 17:13:53.425575 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.425527 2573 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" podUID="19716ec9-a7ef-403c-92ff-331d4a97d4da" containerName="storage-initializer" containerID="cri-o://21fbee4d320a10ea82c4a90f537ed99305821215135d1a98ca7b96d93dfa656d" gracePeriod=30 Apr 24 17:13:53.716764 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.716679 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn"] Apr 24 17:13:53.717041 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.717026 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da43a52b-29e5-4e7d-aee5-af05e07d8566" containerName="storage-initializer" Apr 24 17:13:53.717107 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.717044 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="da43a52b-29e5-4e7d-aee5-af05e07d8566" containerName="storage-initializer" Apr 24 17:13:53.717107 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.717061 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da43a52b-29e5-4e7d-aee5-af05e07d8566" containerName="storage-initializer" Apr 24 17:13:53.717107 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.717068 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="da43a52b-29e5-4e7d-aee5-af05e07d8566" containerName="storage-initializer" Apr 24 17:13:53.717107 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.717078 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" containerName="kube-rbac-proxy" Apr 24 17:13:53.717107 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.717084 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" containerName="kube-rbac-proxy" Apr 24 17:13:53.717107 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.717093 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" containerName="storage-initializer" Apr 24 17:13:53.717107 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.717099 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" containerName="storage-initializer" Apr 24 17:13:53.717107 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.717106 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" containerName="kserve-container" Apr 24 17:13:53.717391 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.717111 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" containerName="kserve-container" Apr 24 17:13:53.717391 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.717168 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" containerName="kserve-container" Apr 24 17:13:53.717391 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.717180 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="da43a52b-29e5-4e7d-aee5-af05e07d8566" containerName="storage-initializer" Apr 24 17:13:53.717391 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.717186 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8c5db1f-0c24-40e2-bdf4-49b335e16b9b" containerName="kube-rbac-proxy" Apr 24 17:13:53.717391 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.717192 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="da43a52b-29e5-4e7d-aee5-af05e07d8566" containerName="storage-initializer" Apr 24 17:13:53.720291 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.720270 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" Apr 24 17:13:53.722623 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.722600 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-2jnrr\"" Apr 24 17:13:53.722757 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.722672 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-predictor-serving-cert\"" Apr 24 17:13:53.722757 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.722676 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\"" Apr 24 17:13:53.730001 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.729972 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn"] Apr 24 17:13:53.813681 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.813635 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8tzd\" (UniqueName: \"kubernetes.io/projected/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-kube-api-access-q8tzd\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn\" (UID: \"a6e06f3d-05a9-4fc7-afcc-af85858e01f1\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" Apr 24 17:13:53.813878 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.813713 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn\" (UID: \"a6e06f3d-05a9-4fc7-afcc-af85858e01f1\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" Apr 
24 17:13:53.813878 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.813742 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn\" (UID: \"a6e06f3d-05a9-4fc7-afcc-af85858e01f1\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" Apr 24 17:13:53.813878 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.813789 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn\" (UID: \"a6e06f3d-05a9-4fc7-afcc-af85858e01f1\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" Apr 24 17:13:53.914680 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.914624 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn\" (UID: \"a6e06f3d-05a9-4fc7-afcc-af85858e01f1\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" Apr 24 17:13:53.914887 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.914706 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8tzd\" (UniqueName: \"kubernetes.io/projected/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-kube-api-access-q8tzd\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn\" (UID: \"a6e06f3d-05a9-4fc7-afcc-af85858e01f1\") " 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" Apr 24 17:13:53.914887 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.914752 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn\" (UID: \"a6e06f3d-05a9-4fc7-afcc-af85858e01f1\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" Apr 24 17:13:53.914887 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.914777 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn\" (UID: \"a6e06f3d-05a9-4fc7-afcc-af85858e01f1\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" Apr 24 17:13:53.915182 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.915159 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn\" (UID: \"a6e06f3d-05a9-4fc7-afcc-af85858e01f1\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" Apr 24 17:13:53.915410 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.915392 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn\" (UID: \"a6e06f3d-05a9-4fc7-afcc-af85858e01f1\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" 
Apr 24 17:13:53.917371 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.917339 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn\" (UID: \"a6e06f3d-05a9-4fc7-afcc-af85858e01f1\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" Apr 24 17:13:53.923095 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:53.923070 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8tzd\" (UniqueName: \"kubernetes.io/projected/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-kube-api-access-q8tzd\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn\" (UID: \"a6e06f3d-05a9-4fc7-afcc-af85858e01f1\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" Apr 24 17:13:54.030837 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:54.030805 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" Apr 24 17:13:54.157032 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:54.156998 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn"] Apr 24 17:13:54.160378 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:13:54.160342 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6e06f3d_05a9_4fc7_afcc_af85858e01f1.slice/crio-ca9ba51d8c8e65caafc5d54a35594b9da2745a2b0b3ae0197eae029f72946e49 WatchSource:0}: Error finding container ca9ba51d8c8e65caafc5d54a35594b9da2745a2b0b3ae0197eae029f72946e49: Status 404 returned error can't find the container with id ca9ba51d8c8e65caafc5d54a35594b9da2745a2b0b3ae0197eae029f72946e49 Apr 24 17:13:54.430143 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:54.430050 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" event={"ID":"a6e06f3d-05a9-4fc7-afcc-af85858e01f1","Type":"ContainerStarted","Data":"27287204a84cbc52122afd9520f969a9ec25a41761f56cc8d65fda73647e8ccb"} Apr 24 17:13:54.430143 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:54.430100 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" event={"ID":"a6e06f3d-05a9-4fc7-afcc-af85858e01f1","Type":"ContainerStarted","Data":"ca9ba51d8c8e65caafc5d54a35594b9da2745a2b0b3ae0197eae029f72946e49"} Apr 24 17:13:57.770461 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:57.770433 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879_19716ec9-a7ef-403c-92ff-331d4a97d4da/storage-initializer/1.log" Apr 24 17:13:57.770803 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:57.770774 2573 log.go:25] "Finished parsing log 
file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879_19716ec9-a7ef-403c-92ff-331d4a97d4da/storage-initializer/0.log" Apr 24 17:13:57.770845 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:57.770836 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" Apr 24 17:13:57.850280 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:57.850185 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-1c499d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/19716ec9-a7ef-403c-92ff-331d4a97d4da-isvc-init-fail-1c499d-kube-rbac-proxy-sar-config\") pod \"19716ec9-a7ef-403c-92ff-331d4a97d4da\" (UID: \"19716ec9-a7ef-403c-92ff-331d4a97d4da\") " Apr 24 17:13:57.850280 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:57.850273 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/19716ec9-a7ef-403c-92ff-331d4a97d4da-kserve-provision-location\") pod \"19716ec9-a7ef-403c-92ff-331d4a97d4da\" (UID: \"19716ec9-a7ef-403c-92ff-331d4a97d4da\") " Apr 24 17:13:57.850521 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:57.850298 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/19716ec9-a7ef-403c-92ff-331d4a97d4da-proxy-tls\") pod \"19716ec9-a7ef-403c-92ff-331d4a97d4da\" (UID: \"19716ec9-a7ef-403c-92ff-331d4a97d4da\") " Apr 24 17:13:57.850521 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:57.850353 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/19716ec9-a7ef-403c-92ff-331d4a97d4da-cabundle-cert\") pod \"19716ec9-a7ef-403c-92ff-331d4a97d4da\" (UID: \"19716ec9-a7ef-403c-92ff-331d4a97d4da\") " Apr 24 17:13:57.850521 
ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:57.850373 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cdxx\" (UniqueName: \"kubernetes.io/projected/19716ec9-a7ef-403c-92ff-331d4a97d4da-kube-api-access-5cdxx\") pod \"19716ec9-a7ef-403c-92ff-331d4a97d4da\" (UID: \"19716ec9-a7ef-403c-92ff-331d4a97d4da\") " Apr 24 17:13:57.850641 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:57.850589 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19716ec9-a7ef-403c-92ff-331d4a97d4da-isvc-init-fail-1c499d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-1c499d-kube-rbac-proxy-sar-config") pod "19716ec9-a7ef-403c-92ff-331d4a97d4da" (UID: "19716ec9-a7ef-403c-92ff-331d4a97d4da"). InnerVolumeSpecName "isvc-init-fail-1c499d-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:13:57.850641 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:57.850626 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19716ec9-a7ef-403c-92ff-331d4a97d4da-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "19716ec9-a7ef-403c-92ff-331d4a97d4da" (UID: "19716ec9-a7ef-403c-92ff-331d4a97d4da"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:13:57.850758 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:57.850731 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19716ec9-a7ef-403c-92ff-331d4a97d4da-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "19716ec9-a7ef-403c-92ff-331d4a97d4da" (UID: "19716ec9-a7ef-403c-92ff-331d4a97d4da"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:13:57.852684 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:57.852598 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19716ec9-a7ef-403c-92ff-331d4a97d4da-kube-api-access-5cdxx" (OuterVolumeSpecName: "kube-api-access-5cdxx") pod "19716ec9-a7ef-403c-92ff-331d4a97d4da" (UID: "19716ec9-a7ef-403c-92ff-331d4a97d4da"). InnerVolumeSpecName "kube-api-access-5cdxx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:13:57.852979 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:57.852955 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19716ec9-a7ef-403c-92ff-331d4a97d4da-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "19716ec9-a7ef-403c-92ff-331d4a97d4da" (UID: "19716ec9-a7ef-403c-92ff-331d4a97d4da"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:13:57.951685 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:57.951644 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-1c499d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/19716ec9-a7ef-403c-92ff-331d4a97d4da-isvc-init-fail-1c499d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:13:57.951685 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:57.951681 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/19716ec9-a7ef-403c-92ff-331d4a97d4da-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:13:57.951685 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:57.951691 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/19716ec9-a7ef-403c-92ff-331d4a97d4da-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 
17:13:57.951952 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:57.951700 2573 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/19716ec9-a7ef-403c-92ff-331d4a97d4da-cabundle-cert\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:13:57.951952 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:57.951711 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5cdxx\" (UniqueName: \"kubernetes.io/projected/19716ec9-a7ef-403c-92ff-331d4a97d4da-kube-api-access-5cdxx\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:13:58.441743 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:58.441713 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879_19716ec9-a7ef-403c-92ff-331d4a97d4da/storage-initializer/1.log" Apr 24 17:13:58.442145 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:58.442126 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879_19716ec9-a7ef-403c-92ff-331d4a97d4da/storage-initializer/0.log" Apr 24 17:13:58.442219 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:58.442177 2573 generic.go:358] "Generic (PLEG): container finished" podID="19716ec9-a7ef-403c-92ff-331d4a97d4da" containerID="21fbee4d320a10ea82c4a90f537ed99305821215135d1a98ca7b96d93dfa656d" exitCode=1 Apr 24 17:13:58.442270 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:58.442250 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" event={"ID":"19716ec9-a7ef-403c-92ff-331d4a97d4da","Type":"ContainerDied","Data":"21fbee4d320a10ea82c4a90f537ed99305821215135d1a98ca7b96d93dfa656d"} Apr 24 17:13:58.442270 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:58.442260 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" Apr 24 17:13:58.442392 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:58.442286 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879" event={"ID":"19716ec9-a7ef-403c-92ff-331d4a97d4da","Type":"ContainerDied","Data":"4e5c438f3161ad48c969ab9b22c9a2c0305a18c463da8f7e7e63871288064bc1"} Apr 24 17:13:58.442392 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:58.442330 2573 scope.go:117] "RemoveContainer" containerID="21fbee4d320a10ea82c4a90f537ed99305821215135d1a98ca7b96d93dfa656d" Apr 24 17:13:58.443904 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:58.443877 2573 generic.go:358] "Generic (PLEG): container finished" podID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerID="27287204a84cbc52122afd9520f969a9ec25a41761f56cc8d65fda73647e8ccb" exitCode=0 Apr 24 17:13:58.444007 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:58.443944 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" event={"ID":"a6e06f3d-05a9-4fc7-afcc-af85858e01f1","Type":"ContainerDied","Data":"27287204a84cbc52122afd9520f969a9ec25a41761f56cc8d65fda73647e8ccb"} Apr 24 17:13:58.450674 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:58.450647 2573 scope.go:117] "RemoveContainer" containerID="1bdc373a9848d1527df6ee6a8bfdb7b9df0fc76fb8416e4dae9387d5bd87be8f" Apr 24 17:13:58.458801 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:58.458783 2573 scope.go:117] "RemoveContainer" containerID="21fbee4d320a10ea82c4a90f537ed99305821215135d1a98ca7b96d93dfa656d" Apr 24 17:13:58.459092 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:13:58.459068 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21fbee4d320a10ea82c4a90f537ed99305821215135d1a98ca7b96d93dfa656d\": container with ID starting with 
21fbee4d320a10ea82c4a90f537ed99305821215135d1a98ca7b96d93dfa656d not found: ID does not exist" containerID="21fbee4d320a10ea82c4a90f537ed99305821215135d1a98ca7b96d93dfa656d" Apr 24 17:13:58.459146 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:58.459101 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21fbee4d320a10ea82c4a90f537ed99305821215135d1a98ca7b96d93dfa656d"} err="failed to get container status \"21fbee4d320a10ea82c4a90f537ed99305821215135d1a98ca7b96d93dfa656d\": rpc error: code = NotFound desc = could not find container \"21fbee4d320a10ea82c4a90f537ed99305821215135d1a98ca7b96d93dfa656d\": container with ID starting with 21fbee4d320a10ea82c4a90f537ed99305821215135d1a98ca7b96d93dfa656d not found: ID does not exist" Apr 24 17:13:58.459146 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:58.459124 2573 scope.go:117] "RemoveContainer" containerID="1bdc373a9848d1527df6ee6a8bfdb7b9df0fc76fb8416e4dae9387d5bd87be8f" Apr 24 17:13:58.459404 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:13:58.459383 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bdc373a9848d1527df6ee6a8bfdb7b9df0fc76fb8416e4dae9387d5bd87be8f\": container with ID starting with 1bdc373a9848d1527df6ee6a8bfdb7b9df0fc76fb8416e4dae9387d5bd87be8f not found: ID does not exist" containerID="1bdc373a9848d1527df6ee6a8bfdb7b9df0fc76fb8416e4dae9387d5bd87be8f" Apr 24 17:13:58.459477 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:58.459416 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bdc373a9848d1527df6ee6a8bfdb7b9df0fc76fb8416e4dae9387d5bd87be8f"} err="failed to get container status \"1bdc373a9848d1527df6ee6a8bfdb7b9df0fc76fb8416e4dae9387d5bd87be8f\": rpc error: code = NotFound desc = could not find container \"1bdc373a9848d1527df6ee6a8bfdb7b9df0fc76fb8416e4dae9387d5bd87be8f\": container with ID starting with 
1bdc373a9848d1527df6ee6a8bfdb7b9df0fc76fb8416e4dae9387d5bd87be8f not found: ID does not exist" Apr 24 17:13:58.482134 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:58.482044 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879"] Apr 24 17:13:58.484255 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:13:58.484225 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-1c499d-predictor-5b69c8b74f-2r879"] Apr 24 17:14:00.217297 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:14:00.217253 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19716ec9-a7ef-403c-92ff-331d4a97d4da" path="/var/lib/kubelet/pods/19716ec9-a7ef-403c-92ff-331d4a97d4da/volumes" Apr 24 17:14:02.243752 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:14:02.243709 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log" Apr 24 17:14:02.246503 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:14:02.246476 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log" Apr 24 17:14:23.524216 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:14:23.524181 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" event={"ID":"a6e06f3d-05a9-4fc7-afcc-af85858e01f1","Type":"ContainerStarted","Data":"f8de8948e38b3aa1ccd82d98784ec6bae81c891cb2c905c5e39b3fcf0058b39d"} Apr 24 17:14:23.524216 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:14:23.524222 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" 
event={"ID":"a6e06f3d-05a9-4fc7-afcc-af85858e01f1","Type":"ContainerStarted","Data":"810614cb04aaeba677273ba72a12e4a121a40cd517383f93d1e4f8c5951bb91e"} Apr 24 17:14:23.524666 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:14:23.524571 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" Apr 24 17:14:23.524706 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:14:23.524690 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" Apr 24 17:14:23.526007 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:14:23.525975 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" podUID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 17:14:23.541860 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:14:23.541682 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" podStartSLOduration=6.171399633 podStartE2EDuration="30.541641207s" podCreationTimestamp="2026-04-24 17:13:53 +0000 UTC" firstStartedPulling="2026-04-24 17:13:58.44540745 +0000 UTC m=+2096.710122826" lastFinishedPulling="2026-04-24 17:14:22.815649011 +0000 UTC m=+2121.080364400" observedRunningTime="2026-04-24 17:14:23.541406872 +0000 UTC m=+2121.806122329" watchObservedRunningTime="2026-04-24 17:14:23.541641207 +0000 UTC m=+2121.806356612" Apr 24 17:14:24.527518 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:14:24.527472 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" podUID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 17:14:29.531603 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:14:29.531573 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" Apr 24 17:14:29.532420 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:14:29.532379 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" podUID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 17:14:39.532248 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:14:39.532197 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" podUID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 17:14:49.532263 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:14:49.532201 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" podUID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 17:14:59.532700 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:14:59.532652 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" podUID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 17:15:09.532576 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:09.532528 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" podUID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 17:15:19.532385 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:19.532343 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" podUID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 17:15:29.532702 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:29.532663 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" podUID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 17:15:38.216955 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:38.216927 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" Apr 24 17:15:43.360645 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.360603 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn"] Apr 24 17:15:43.361140 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.360949 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" podUID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerName="kserve-container" containerID="cri-o://810614cb04aaeba677273ba72a12e4a121a40cd517383f93d1e4f8c5951bb91e" gracePeriod=30 Apr 24 17:15:43.361140 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.361000 2573 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" podUID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerName="kube-rbac-proxy" containerID="cri-o://f8de8948e38b3aa1ccd82d98784ec6bae81c891cb2c905c5e39b3fcf0058b39d" gracePeriod=30 Apr 24 17:15:43.586783 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.586735 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf"] Apr 24 17:15:43.587049 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.587036 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19716ec9-a7ef-403c-92ff-331d4a97d4da" containerName="storage-initializer" Apr 24 17:15:43.587105 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.587050 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="19716ec9-a7ef-403c-92ff-331d4a97d4da" containerName="storage-initializer" Apr 24 17:15:43.587105 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.587062 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19716ec9-a7ef-403c-92ff-331d4a97d4da" containerName="storage-initializer" Apr 24 17:15:43.587105 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.587068 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="19716ec9-a7ef-403c-92ff-331d4a97d4da" containerName="storage-initializer" Apr 24 17:15:43.587204 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.587116 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="19716ec9-a7ef-403c-92ff-331d4a97d4da" containerName="storage-initializer" Apr 24 17:15:43.587237 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.587211 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="19716ec9-a7ef-403c-92ff-331d4a97d4da" containerName="storage-initializer" Apr 24 17:15:43.590193 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.590158 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" Apr 24 17:15:43.593185 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.593150 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-predictor-serving-cert\"" Apr 24 17:15:43.593348 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.593156 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\"" Apr 24 17:15:43.596152 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.596125 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf"] Apr 24 17:15:43.635219 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.635113 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf\" (UID: \"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" Apr 24 17:15:43.635219 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.635164 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf\" (UID: \"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" Apr 24 17:15:43.635219 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.635189 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hgtx9\" (UniqueName: \"kubernetes.io/projected/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-kube-api-access-hgtx9\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf\" (UID: \"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" Apr 24 17:15:43.635504 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.635276 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf\" (UID: \"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" Apr 24 17:15:43.736624 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.736567 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf\" (UID: \"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" Apr 24 17:15:43.736624 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.736623 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf\" (UID: \"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" Apr 24 17:15:43.736895 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.736644 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hgtx9\" (UniqueName: 
\"kubernetes.io/projected/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-kube-api-access-hgtx9\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf\" (UID: \"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" Apr 24 17:15:43.736895 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.736677 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf\" (UID: \"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" Apr 24 17:15:43.737174 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.737153 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf\" (UID: \"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" Apr 24 17:15:43.737415 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.737394 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf\" (UID: \"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" Apr 24 17:15:43.739228 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.739209 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf\" (UID: \"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" Apr 24 17:15:43.746096 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.746061 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgtx9\" (UniqueName: \"kubernetes.io/projected/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-kube-api-access-hgtx9\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf\" (UID: \"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" Apr 24 17:15:43.747536 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.747509 2573 generic.go:358] "Generic (PLEG): container finished" podID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerID="f8de8948e38b3aa1ccd82d98784ec6bae81c891cb2c905c5e39b3fcf0058b39d" exitCode=2 Apr 24 17:15:43.747622 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.747577 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" event={"ID":"a6e06f3d-05a9-4fc7-afcc-af85858e01f1","Type":"ContainerDied","Data":"f8de8948e38b3aa1ccd82d98784ec6bae81c891cb2c905c5e39b3fcf0058b39d"} Apr 24 17:15:43.902060 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:43.901874 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" Apr 24 17:15:44.033508 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:44.033480 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf"] Apr 24 17:15:44.036082 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:15:44.036044 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9af3c6b6_2a66_4db5_a7fd_2aad0e926c46.slice/crio-29077fc21c76d0536cc7891e8e079dddd4970a0f9450f7d1b2f250f6b0620403 WatchSource:0}: Error finding container 29077fc21c76d0536cc7891e8e079dddd4970a0f9450f7d1b2f250f6b0620403: Status 404 returned error can't find the container with id 29077fc21c76d0536cc7891e8e079dddd4970a0f9450f7d1b2f250f6b0620403 Apr 24 17:15:44.528536 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:44.528474 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" podUID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.40:8643/healthz\": dial tcp 10.134.0.40:8643: connect: connection refused" Apr 24 17:15:44.751808 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:44.751765 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" event={"ID":"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46","Type":"ContainerStarted","Data":"9330dfd69711c421b6e168a2ded552627a7ab884993e93e568e34de89fd803f0"} Apr 24 17:15:44.751808 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:44.751808 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" 
event={"ID":"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46","Type":"ContainerStarted","Data":"29077fc21c76d0536cc7891e8e079dddd4970a0f9450f7d1b2f250f6b0620403"} Apr 24 17:15:47.762112 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:47.762076 2573 generic.go:358] "Generic (PLEG): container finished" podID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerID="9330dfd69711c421b6e168a2ded552627a7ab884993e93e568e34de89fd803f0" exitCode=0 Apr 24 17:15:47.762562 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:47.762149 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" event={"ID":"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46","Type":"ContainerDied","Data":"9330dfd69711c421b6e168a2ded552627a7ab884993e93e568e34de89fd803f0"} Apr 24 17:15:48.213682 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.213590 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" podUID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 24 17:15:48.702270 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.702245 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" Apr 24 17:15:48.767397 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.767359 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" event={"ID":"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46","Type":"ContainerStarted","Data":"620d93f9a2f6a269f38a152f50c69931b52de5b2413bb223c436cc29ce8721b9"} Apr 24 17:15:48.767397 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.767401 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" event={"ID":"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46","Type":"ContainerStarted","Data":"add76cc10330ab830c5c472d7387a1cb8a78c85f91acf6ee1ffa62038b027aec"} Apr 24 17:15:48.767891 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.767719 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" Apr 24 17:15:48.767891 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.767779 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" Apr 24 17:15:48.769067 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.769041 2573 generic.go:358] "Generic (PLEG): container finished" podID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerID="810614cb04aaeba677273ba72a12e4a121a40cd517383f93d1e4f8c5951bb91e" exitCode=0 Apr 24 17:15:48.769203 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.769078 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" event={"ID":"a6e06f3d-05a9-4fc7-afcc-af85858e01f1","Type":"ContainerDied","Data":"810614cb04aaeba677273ba72a12e4a121a40cd517383f93d1e4f8c5951bb91e"} Apr 24 17:15:48.769203 ip-10-0-142-182 kubenswrapper[2573]: 
I0424 17:15:48.769107 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" event={"ID":"a6e06f3d-05a9-4fc7-afcc-af85858e01f1","Type":"ContainerDied","Data":"ca9ba51d8c8e65caafc5d54a35594b9da2745a2b0b3ae0197eae029f72946e49"} Apr 24 17:15:48.769203 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.769122 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn" Apr 24 17:15:48.769320 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.769208 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" podUID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 24 17:15:48.769320 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.769128 2573 scope.go:117] "RemoveContainer" containerID="f8de8948e38b3aa1ccd82d98784ec6bae81c891cb2c905c5e39b3fcf0058b39d" Apr 24 17:15:48.777047 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.777027 2573 scope.go:117] "RemoveContainer" containerID="810614cb04aaeba677273ba72a12e4a121a40cd517383f93d1e4f8c5951bb91e" Apr 24 17:15:48.780369 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.780351 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-kserve-provision-location\") pod \"a6e06f3d-05a9-4fc7-afcc-af85858e01f1\" (UID: \"a6e06f3d-05a9-4fc7-afcc-af85858e01f1\") " Apr 24 17:15:48.780489 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.780386 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"a6e06f3d-05a9-4fc7-afcc-af85858e01f1\" (UID: \"a6e06f3d-05a9-4fc7-afcc-af85858e01f1\") " Apr 24 17:15:48.780489 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.780421 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-proxy-tls\") pod \"a6e06f3d-05a9-4fc7-afcc-af85858e01f1\" (UID: \"a6e06f3d-05a9-4fc7-afcc-af85858e01f1\") " Apr 24 17:15:48.780600 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.780523 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8tzd\" (UniqueName: \"kubernetes.io/projected/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-kube-api-access-q8tzd\") pod \"a6e06f3d-05a9-4fc7-afcc-af85858e01f1\" (UID: \"a6e06f3d-05a9-4fc7-afcc-af85858e01f1\") " Apr 24 17:15:48.780719 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.780693 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a6e06f3d-05a9-4fc7-afcc-af85858e01f1" (UID: "a6e06f3d-05a9-4fc7-afcc-af85858e01f1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:15:48.780800 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.780755 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-isvc-predictive-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-kube-rbac-proxy-sar-config") pod "a6e06f3d-05a9-4fc7-afcc-af85858e01f1" (UID: "a6e06f3d-05a9-4fc7-afcc-af85858e01f1"). InnerVolumeSpecName "isvc-predictive-sklearn-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:15:48.782679 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.782646 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a6e06f3d-05a9-4fc7-afcc-af85858e01f1" (UID: "a6e06f3d-05a9-4fc7-afcc-af85858e01f1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:15:48.782768 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.782703 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-kube-api-access-q8tzd" (OuterVolumeSpecName: "kube-api-access-q8tzd") pod "a6e06f3d-05a9-4fc7-afcc-af85858e01f1" (UID: "a6e06f3d-05a9-4fc7-afcc-af85858e01f1"). InnerVolumeSpecName "kube-api-access-q8tzd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:15:48.784747 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.784730 2573 scope.go:117] "RemoveContainer" containerID="27287204a84cbc52122afd9520f969a9ec25a41761f56cc8d65fda73647e8ccb" Apr 24 17:15:48.788661 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.788622 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" podStartSLOduration=5.788607758 podStartE2EDuration="5.788607758s" podCreationTimestamp="2026-04-24 17:15:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:15:48.786965931 +0000 UTC m=+2207.051681328" watchObservedRunningTime="2026-04-24 17:15:48.788607758 +0000 UTC m=+2207.053323153" Apr 24 17:15:48.796537 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.796516 2573 scope.go:117] "RemoveContainer" containerID="f8de8948e38b3aa1ccd82d98784ec6bae81c891cb2c905c5e39b3fcf0058b39d" Apr 
24 17:15:48.796868 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:15:48.796846 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8de8948e38b3aa1ccd82d98784ec6bae81c891cb2c905c5e39b3fcf0058b39d\": container with ID starting with f8de8948e38b3aa1ccd82d98784ec6bae81c891cb2c905c5e39b3fcf0058b39d not found: ID does not exist" containerID="f8de8948e38b3aa1ccd82d98784ec6bae81c891cb2c905c5e39b3fcf0058b39d" Apr 24 17:15:48.796946 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.796882 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8de8948e38b3aa1ccd82d98784ec6bae81c891cb2c905c5e39b3fcf0058b39d"} err="failed to get container status \"f8de8948e38b3aa1ccd82d98784ec6bae81c891cb2c905c5e39b3fcf0058b39d\": rpc error: code = NotFound desc = could not find container \"f8de8948e38b3aa1ccd82d98784ec6bae81c891cb2c905c5e39b3fcf0058b39d\": container with ID starting with f8de8948e38b3aa1ccd82d98784ec6bae81c891cb2c905c5e39b3fcf0058b39d not found: ID does not exist" Apr 24 17:15:48.796946 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.796905 2573 scope.go:117] "RemoveContainer" containerID="810614cb04aaeba677273ba72a12e4a121a40cd517383f93d1e4f8c5951bb91e" Apr 24 17:15:48.797164 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:15:48.797147 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"810614cb04aaeba677273ba72a12e4a121a40cd517383f93d1e4f8c5951bb91e\": container with ID starting with 810614cb04aaeba677273ba72a12e4a121a40cd517383f93d1e4f8c5951bb91e not found: ID does not exist" containerID="810614cb04aaeba677273ba72a12e4a121a40cd517383f93d1e4f8c5951bb91e" Apr 24 17:15:48.797209 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.797171 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"810614cb04aaeba677273ba72a12e4a121a40cd517383f93d1e4f8c5951bb91e"} err="failed to get container status \"810614cb04aaeba677273ba72a12e4a121a40cd517383f93d1e4f8c5951bb91e\": rpc error: code = NotFound desc = could not find container \"810614cb04aaeba677273ba72a12e4a121a40cd517383f93d1e4f8c5951bb91e\": container with ID starting with 810614cb04aaeba677273ba72a12e4a121a40cd517383f93d1e4f8c5951bb91e not found: ID does not exist" Apr 24 17:15:48.797209 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.797186 2573 scope.go:117] "RemoveContainer" containerID="27287204a84cbc52122afd9520f969a9ec25a41761f56cc8d65fda73647e8ccb" Apr 24 17:15:48.797467 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:15:48.797449 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27287204a84cbc52122afd9520f969a9ec25a41761f56cc8d65fda73647e8ccb\": container with ID starting with 27287204a84cbc52122afd9520f969a9ec25a41761f56cc8d65fda73647e8ccb not found: ID does not exist" containerID="27287204a84cbc52122afd9520f969a9ec25a41761f56cc8d65fda73647e8ccb" Apr 24 17:15:48.797527 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.797470 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27287204a84cbc52122afd9520f969a9ec25a41761f56cc8d65fda73647e8ccb"} err="failed to get container status \"27287204a84cbc52122afd9520f969a9ec25a41761f56cc8d65fda73647e8ccb\": rpc error: code = NotFound desc = could not find container \"27287204a84cbc52122afd9520f969a9ec25a41761f56cc8d65fda73647e8ccb\": container with ID starting with 27287204a84cbc52122afd9520f969a9ec25a41761f56cc8d65fda73647e8ccb not found: ID does not exist" Apr 24 17:15:48.881809 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.881768 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q8tzd\" (UniqueName: 
\"kubernetes.io/projected/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-kube-api-access-q8tzd\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:15:48.881809 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.881805 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:15:48.881809 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.881816 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:15:48.882102 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:48.881826 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6e06f3d-05a9-4fc7-afcc-af85858e01f1-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:15:49.090397 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:49.090359 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn"] Apr 24 17:15:49.093950 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:49.093919 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-wv5nn"] Apr 24 17:15:49.772776 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:49.772736 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" podUID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 24 17:15:50.216684 
ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:50.216597 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" path="/var/lib/kubelet/pods/a6e06f3d-05a9-4fc7-afcc-af85858e01f1/volumes" Apr 24 17:15:54.776895 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:54.776862 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" Apr 24 17:15:54.777531 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:15:54.777494 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" podUID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 24 17:16:04.777870 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:16:04.777823 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" podUID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 24 17:16:14.777673 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:16:14.777630 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" podUID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 24 17:16:24.777597 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:16:24.777506 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" podUID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" 
Apr 24 17:16:34.777957 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:16:34.777913 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" podUID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 24 17:16:44.777700 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:16:44.777655 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" podUID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 24 17:16:54.777580 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:16:54.777538 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" podUID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 24 17:17:04.778467 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:04.778434 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" Apr 24 17:17:13.460750 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.460707 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf"] Apr 24 17:17:13.461152 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.461061 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" podUID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerName="kserve-container" containerID="cri-o://add76cc10330ab830c5c472d7387a1cb8a78c85f91acf6ee1ffa62038b027aec" gracePeriod=30 
Apr 24 17:17:13.461220 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.461122 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" podUID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerName="kube-rbac-proxy" containerID="cri-o://620d93f9a2f6a269f38a152f50c69931b52de5b2413bb223c436cc29ce8721b9" gracePeriod=30 Apr 24 17:17:13.589436 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.589404 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg"] Apr 24 17:17:13.589716 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.589702 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerName="storage-initializer" Apr 24 17:17:13.589762 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.589718 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerName="storage-initializer" Apr 24 17:17:13.589762 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.589741 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerName="kserve-container" Apr 24 17:17:13.589762 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.589747 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerName="kserve-container" Apr 24 17:17:13.589762 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.589754 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerName="kube-rbac-proxy" Apr 24 17:17:13.589762 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.589760 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerName="kube-rbac-proxy" Apr 24 
17:17:13.589906 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.589805 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerName="kserve-container" Apr 24 17:17:13.589906 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.589813 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6e06f3d-05a9-4fc7-afcc-af85858e01f1" containerName="kube-rbac-proxy" Apr 24 17:17:13.592913 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.592891 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" Apr 24 17:17:13.595125 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.595098 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-predictor-serving-cert\"" Apr 24 17:17:13.595125 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.595129 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\"" Apr 24 17:17:13.601673 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.601648 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg"] Apr 24 17:17:13.641240 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.641202 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhphv\" (UniqueName: \"kubernetes.io/projected/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-kube-api-access-xhphv\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-52psg\" (UID: \"c3fc3f20-677d-4691-bdca-ea2c32ba71cf\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" Apr 24 17:17:13.641458 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.641257 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-52psg\" (UID: \"c3fc3f20-677d-4691-bdca-ea2c32ba71cf\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" Apr 24 17:17:13.641458 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.641358 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-52psg\" (UID: \"c3fc3f20-677d-4691-bdca-ea2c32ba71cf\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" Apr 24 17:17:13.641458 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.641391 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-52psg\" (UID: \"c3fc3f20-677d-4691-bdca-ea2c32ba71cf\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" Apr 24 17:17:13.742486 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.742448 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-52psg\" (UID: \"c3fc3f20-677d-4691-bdca-ea2c32ba71cf\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" Apr 24 17:17:13.742673 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.742516 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-52psg\" (UID: \"c3fc3f20-677d-4691-bdca-ea2c32ba71cf\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" Apr 24 17:17:13.742673 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.742563 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-52psg\" (UID: \"c3fc3f20-677d-4691-bdca-ea2c32ba71cf\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" Apr 24 17:17:13.742673 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.742599 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhphv\" (UniqueName: \"kubernetes.io/projected/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-kube-api-access-xhphv\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-52psg\" (UID: \"c3fc3f20-677d-4691-bdca-ea2c32ba71cf\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" Apr 24 17:17:13.742673 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:17:13.742622 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-serving-cert: secret "isvc-predictive-lightgbm-predictor-serving-cert" not found Apr 24 17:17:13.742900 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:17:13.742703 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-proxy-tls podName:c3fc3f20-677d-4691-bdca-ea2c32ba71cf nodeName:}" failed. 
No retries permitted until 2026-04-24 17:17:14.242681585 +0000 UTC m=+2292.507396960 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-proxy-tls") pod "isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" (UID: "c3fc3f20-677d-4691-bdca-ea2c32ba71cf") : secret "isvc-predictive-lightgbm-predictor-serving-cert" not found Apr 24 17:17:13.743001 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.742981 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-52psg\" (UID: \"c3fc3f20-677d-4691-bdca-ea2c32ba71cf\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" Apr 24 17:17:13.743179 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.743160 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-52psg\" (UID: \"c3fc3f20-677d-4691-bdca-ea2c32ba71cf\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" Apr 24 17:17:13.753236 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:13.753206 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhphv\" (UniqueName: \"kubernetes.io/projected/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-kube-api-access-xhphv\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-52psg\" (UID: \"c3fc3f20-677d-4691-bdca-ea2c32ba71cf\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" Apr 24 17:17:14.017008 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:14.016920 2573 generic.go:358] 
"Generic (PLEG): container finished" podID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerID="620d93f9a2f6a269f38a152f50c69931b52de5b2413bb223c436cc29ce8721b9" exitCode=2 Apr 24 17:17:14.017008 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:14.016975 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" event={"ID":"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46","Type":"ContainerDied","Data":"620d93f9a2f6a269f38a152f50c69931b52de5b2413bb223c436cc29ce8721b9"} Apr 24 17:17:14.246335 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:14.246262 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-52psg\" (UID: \"c3fc3f20-677d-4691-bdca-ea2c32ba71cf\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" Apr 24 17:17:14.248810 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:14.248780 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-52psg\" (UID: \"c3fc3f20-677d-4691-bdca-ea2c32ba71cf\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" Apr 24 17:17:14.504275 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:14.504233 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" Apr 24 17:17:14.633250 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:14.633067 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg"] Apr 24 17:17:14.636432 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:17:14.636396 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3fc3f20_677d_4691_bdca_ea2c32ba71cf.slice/crio-5cf4c77bcd6173104bc4e595ac611dab6cee20ca5a4d0c668b3b97712fd3716d WatchSource:0}: Error finding container 5cf4c77bcd6173104bc4e595ac611dab6cee20ca5a4d0c668b3b97712fd3716d: Status 404 returned error can't find the container with id 5cf4c77bcd6173104bc4e595ac611dab6cee20ca5a4d0c668b3b97712fd3716d Apr 24 17:17:14.638649 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:14.638631 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 17:17:14.773602 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:14.773492 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" podUID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.41:8643/healthz\": dial tcp 10.134.0.41:8643: connect: connection refused" Apr 24 17:17:14.777968 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:14.777927 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" podUID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 24 17:17:15.022019 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:15.021978 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" event={"ID":"c3fc3f20-677d-4691-bdca-ea2c32ba71cf","Type":"ContainerStarted","Data":"9bbea1a870e48b4799754ed4e87aeff9a76d611e21712bda548ce005c61ea5e4"} Apr 24 17:17:15.022019 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:15.022017 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" event={"ID":"c3fc3f20-677d-4691-bdca-ea2c32ba71cf","Type":"ContainerStarted","Data":"5cf4c77bcd6173104bc4e595ac611dab6cee20ca5a4d0c668b3b97712fd3716d"} Apr 24 17:17:18.902352 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:18.902325 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" Apr 24 17:17:18.985743 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:18.985700 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46\" (UID: \"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46\") " Apr 24 17:17:18.985937 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:18.985802 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-kserve-provision-location\") pod \"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46\" (UID: \"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46\") " Apr 24 17:17:18.985937 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:18.985839 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-proxy-tls\") pod \"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46\" (UID: 
\"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46\") " Apr 24 17:17:18.985937 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:18.985868 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgtx9\" (UniqueName: \"kubernetes.io/projected/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-kube-api-access-hgtx9\") pod \"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46\" (UID: \"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46\") " Apr 24 17:17:18.986125 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:18.986096 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-isvc-predictive-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-kube-rbac-proxy-sar-config") pod "9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" (UID: "9af3c6b6-2a66-4db5-a7fd-2aad0e926c46"). InnerVolumeSpecName "isvc-predictive-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:17:18.986210 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:18.986157 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" (UID: "9af3c6b6-2a66-4db5-a7fd-2aad0e926c46"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:17:18.988081 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:18.988053 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-kube-api-access-hgtx9" (OuterVolumeSpecName: "kube-api-access-hgtx9") pod "9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" (UID: "9af3c6b6-2a66-4db5-a7fd-2aad0e926c46"). InnerVolumeSpecName "kube-api-access-hgtx9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:17:18.988255 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:18.988234 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" (UID: "9af3c6b6-2a66-4db5-a7fd-2aad0e926c46"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:17:19.037710 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:19.037669 2573 generic.go:358] "Generic (PLEG): container finished" podID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" containerID="9bbea1a870e48b4799754ed4e87aeff9a76d611e21712bda548ce005c61ea5e4" exitCode=0 Apr 24 17:17:19.037886 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:19.037748 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" event={"ID":"c3fc3f20-677d-4691-bdca-ea2c32ba71cf","Type":"ContainerDied","Data":"9bbea1a870e48b4799754ed4e87aeff9a76d611e21712bda548ce005c61ea5e4"} Apr 24 17:17:19.039611 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:19.039575 2573 generic.go:358] "Generic (PLEG): container finished" podID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerID="add76cc10330ab830c5c472d7387a1cb8a78c85f91acf6ee1ffa62038b027aec" exitCode=0 Apr 24 17:17:19.039728 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:19.039629 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" event={"ID":"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46","Type":"ContainerDied","Data":"add76cc10330ab830c5c472d7387a1cb8a78c85f91acf6ee1ffa62038b027aec"} Apr 24 17:17:19.039728 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:19.039646 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" Apr 24 17:17:19.039728 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:19.039663 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf" event={"ID":"9af3c6b6-2a66-4db5-a7fd-2aad0e926c46","Type":"ContainerDied","Data":"29077fc21c76d0536cc7891e8e079dddd4970a0f9450f7d1b2f250f6b0620403"} Apr 24 17:17:19.039728 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:19.039685 2573 scope.go:117] "RemoveContainer" containerID="620d93f9a2f6a269f38a152f50c69931b52de5b2413bb223c436cc29ce8721b9" Apr 24 17:17:19.048516 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:19.048493 2573 scope.go:117] "RemoveContainer" containerID="add76cc10330ab830c5c472d7387a1cb8a78c85f91acf6ee1ffa62038b027aec" Apr 24 17:17:19.059124 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:19.059100 2573 scope.go:117] "RemoveContainer" containerID="9330dfd69711c421b6e168a2ded552627a7ab884993e93e568e34de89fd803f0" Apr 24 17:17:19.067408 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:19.067329 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf"] Apr 24 17:17:19.067486 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:19.067445 2573 scope.go:117] "RemoveContainer" containerID="620d93f9a2f6a269f38a152f50c69931b52de5b2413bb223c436cc29ce8721b9" Apr 24 17:17:19.067797 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:17:19.067776 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"620d93f9a2f6a269f38a152f50c69931b52de5b2413bb223c436cc29ce8721b9\": container with ID starting with 620d93f9a2f6a269f38a152f50c69931b52de5b2413bb223c436cc29ce8721b9 not found: ID does not exist" containerID="620d93f9a2f6a269f38a152f50c69931b52de5b2413bb223c436cc29ce8721b9" Apr 24 17:17:19.067883 ip-10-0-142-182 
kubenswrapper[2573]: I0424 17:17:19.067815 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"620d93f9a2f6a269f38a152f50c69931b52de5b2413bb223c436cc29ce8721b9"} err="failed to get container status \"620d93f9a2f6a269f38a152f50c69931b52de5b2413bb223c436cc29ce8721b9\": rpc error: code = NotFound desc = could not find container \"620d93f9a2f6a269f38a152f50c69931b52de5b2413bb223c436cc29ce8721b9\": container with ID starting with 620d93f9a2f6a269f38a152f50c69931b52de5b2413bb223c436cc29ce8721b9 not found: ID does not exist" Apr 24 17:17:19.067883 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:19.067834 2573 scope.go:117] "RemoveContainer" containerID="add76cc10330ab830c5c472d7387a1cb8a78c85f91acf6ee1ffa62038b027aec" Apr 24 17:17:19.068104 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:17:19.068088 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"add76cc10330ab830c5c472d7387a1cb8a78c85f91acf6ee1ffa62038b027aec\": container with ID starting with add76cc10330ab830c5c472d7387a1cb8a78c85f91acf6ee1ffa62038b027aec not found: ID does not exist" containerID="add76cc10330ab830c5c472d7387a1cb8a78c85f91acf6ee1ffa62038b027aec" Apr 24 17:17:19.068144 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:19.068111 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add76cc10330ab830c5c472d7387a1cb8a78c85f91acf6ee1ffa62038b027aec"} err="failed to get container status \"add76cc10330ab830c5c472d7387a1cb8a78c85f91acf6ee1ffa62038b027aec\": rpc error: code = NotFound desc = could not find container \"add76cc10330ab830c5c472d7387a1cb8a78c85f91acf6ee1ffa62038b027aec\": container with ID starting with add76cc10330ab830c5c472d7387a1cb8a78c85f91acf6ee1ffa62038b027aec not found: ID does not exist" Apr 24 17:17:19.068144 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:19.068127 2573 scope.go:117] "RemoveContainer" 
containerID="9330dfd69711c421b6e168a2ded552627a7ab884993e93e568e34de89fd803f0" Apr 24 17:17:19.068611 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:17:19.068585 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9330dfd69711c421b6e168a2ded552627a7ab884993e93e568e34de89fd803f0\": container with ID starting with 9330dfd69711c421b6e168a2ded552627a7ab884993e93e568e34de89fd803f0 not found: ID does not exist" containerID="9330dfd69711c421b6e168a2ded552627a7ab884993e93e568e34de89fd803f0" Apr 24 17:17:19.068728 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:19.068618 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9330dfd69711c421b6e168a2ded552627a7ab884993e93e568e34de89fd803f0"} err="failed to get container status \"9330dfd69711c421b6e168a2ded552627a7ab884993e93e568e34de89fd803f0\": rpc error: code = NotFound desc = could not find container \"9330dfd69711c421b6e168a2ded552627a7ab884993e93e568e34de89fd803f0\": container with ID starting with 9330dfd69711c421b6e168a2ded552627a7ab884993e93e568e34de89fd803f0 not found: ID does not exist" Apr 24 17:17:19.070224 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:19.070199 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-sbjzf"] Apr 24 17:17:19.087484 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:19.087445 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:17:19.087484 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:19.087471 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:17:19.087484 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:19.087484 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:17:19.087742 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:19.087498 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hgtx9\" (UniqueName: \"kubernetes.io/projected/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46-kube-api-access-hgtx9\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:17:20.045082 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:20.045043 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" event={"ID":"c3fc3f20-677d-4691-bdca-ea2c32ba71cf","Type":"ContainerStarted","Data":"7fa46a57603e5d73b37f865bd9d0b3ad8daf35cb48fd6e425e872394fbabafcc"} Apr 24 17:17:20.045082 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:20.045085 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" event={"ID":"c3fc3f20-677d-4691-bdca-ea2c32ba71cf","Type":"ContainerStarted","Data":"a897c5411d92d7d7e3c755ecfba057278fdb7c9390f6369953f99dd171c3ce2c"} Apr 24 17:17:20.045645 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:20.045289 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" Apr 24 17:17:20.063296 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:20.063244 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" 
podStartSLOduration=7.063227968 podStartE2EDuration="7.063227968s" podCreationTimestamp="2026-04-24 17:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:17:20.061795595 +0000 UTC m=+2298.326510994" watchObservedRunningTime="2026-04-24 17:17:20.063227968 +0000 UTC m=+2298.327943366" Apr 24 17:17:20.217005 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:20.216971 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" path="/var/lib/kubelet/pods/9af3c6b6-2a66-4db5-a7fd-2aad0e926c46/volumes" Apr 24 17:17:21.047803 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:21.047771 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" Apr 24 17:17:21.049055 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:21.049025 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" podUID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 17:17:22.050054 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:22.050010 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" podUID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 17:17:27.054673 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:27.054640 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" Apr 24 17:17:27.055241 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:27.055211 2573 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" podUID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 17:17:37.056001 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:37.055960 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" podUID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 17:17:47.055481 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:47.055441 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" podUID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 17:17:57.056165 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:17:57.056067 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" podUID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 17:18:07.055458 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:07.055408 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" podUID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 17:18:17.055901 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:17.055857 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" podUID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 17:18:27.055451 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:27.055408 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" podUID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 17:18:37.056482 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:37.056450 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" Apr 24 17:18:43.699971 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.699933 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg"] Apr 24 17:18:43.700412 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.700264 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" podUID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" containerName="kserve-container" containerID="cri-o://a897c5411d92d7d7e3c755ecfba057278fdb7c9390f6369953f99dd171c3ce2c" gracePeriod=30 Apr 24 17:18:43.700412 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.700334 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" podUID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" containerName="kube-rbac-proxy" containerID="cri-o://7fa46a57603e5d73b37f865bd9d0b3ad8daf35cb48fd6e425e872394fbabafcc" gracePeriod=30 Apr 24 17:18:43.821463 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.821429 2573 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8"] Apr 24 17:18:43.821740 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.821725 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerName="storage-initializer" Apr 24 17:18:43.821740 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.821740 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerName="storage-initializer" Apr 24 17:18:43.821828 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.821754 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerName="kserve-container" Apr 24 17:18:43.821828 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.821759 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerName="kserve-container" Apr 24 17:18:43.821828 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.821767 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerName="kube-rbac-proxy" Apr 24 17:18:43.821828 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.821773 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerName="kube-rbac-proxy" Apr 24 17:18:43.821828 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.821822 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerName="kserve-container" Apr 24 17:18:43.822047 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.821835 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9af3c6b6-2a66-4db5-a7fd-2aad0e926c46" containerName="kube-rbac-proxy" Apr 24 17:18:43.824691 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.824666 2573 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" Apr 24 17:18:43.827171 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.827068 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 24 17:18:43.827298 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.827218 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-predictor-serving-cert\"" Apr 24 17:18:43.833685 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.833656 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8"] Apr 24 17:18:43.887873 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.887828 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8\" (UID: \"0323fa27-e3b8-4ff6-bce1-99d2db80daa2\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" Apr 24 17:18:43.887873 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.887875 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtzht\" (UniqueName: \"kubernetes.io/projected/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-kube-api-access-wtzht\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8\" (UID: \"0323fa27-e3b8-4ff6-bce1-99d2db80daa2\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" Apr 24 17:18:43.888137 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.887958 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8\" (UID: \"0323fa27-e3b8-4ff6-bce1-99d2db80daa2\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" Apr 24 17:18:43.888137 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.888009 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8\" (UID: \"0323fa27-e3b8-4ff6-bce1-99d2db80daa2\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" Apr 24 17:18:43.989350 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.989281 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8\" (UID: \"0323fa27-e3b8-4ff6-bce1-99d2db80daa2\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" Apr 24 17:18:43.989547 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.989358 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtzht\" (UniqueName: \"kubernetes.io/projected/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-kube-api-access-wtzht\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8\" (UID: \"0323fa27-e3b8-4ff6-bce1-99d2db80daa2\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" Apr 24 17:18:43.989547 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.989396 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8\" (UID: \"0323fa27-e3b8-4ff6-bce1-99d2db80daa2\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" Apr 24 17:18:43.989547 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.989419 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8\" (UID: \"0323fa27-e3b8-4ff6-bce1-99d2db80daa2\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" Apr 24 17:18:43.989852 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.989833 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8\" (UID: \"0323fa27-e3b8-4ff6-bce1-99d2db80daa2\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" Apr 24 17:18:43.990143 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.990123 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8\" (UID: \"0323fa27-e3b8-4ff6-bce1-99d2db80daa2\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" Apr 24 17:18:43.991896 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.991867 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8\" (UID: \"0323fa27-e3b8-4ff6-bce1-99d2db80daa2\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" Apr 24 17:18:43.997545 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:43.997520 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtzht\" (UniqueName: \"kubernetes.io/projected/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-kube-api-access-wtzht\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8\" (UID: \"0323fa27-e3b8-4ff6-bce1-99d2db80daa2\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" Apr 24 17:18:44.137783 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:44.137740 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" Apr 24 17:18:44.267841 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:44.267813 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8"] Apr 24 17:18:44.270456 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:18:44.270427 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0323fa27_e3b8_4ff6_bce1_99d2db80daa2.slice/crio-b878ccf3274ba063bd96760eeeec952300914f5e5e6b769c334d81849a25fd4d WatchSource:0}: Error finding container b878ccf3274ba063bd96760eeeec952300914f5e5e6b769c334d81849a25fd4d: Status 404 returned error can't find the container with id b878ccf3274ba063bd96760eeeec952300914f5e5e6b769c334d81849a25fd4d Apr 24 17:18:44.315770 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:44.315734 2573 generic.go:358] "Generic (PLEG): container finished" podID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" 
containerID="7fa46a57603e5d73b37f865bd9d0b3ad8daf35cb48fd6e425e872394fbabafcc" exitCode=2 Apr 24 17:18:44.315943 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:44.315812 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" event={"ID":"c3fc3f20-677d-4691-bdca-ea2c32ba71cf","Type":"ContainerDied","Data":"7fa46a57603e5d73b37f865bd9d0b3ad8daf35cb48fd6e425e872394fbabafcc"} Apr 24 17:18:44.316934 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:44.316914 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" event={"ID":"0323fa27-e3b8-4ff6-bce1-99d2db80daa2","Type":"ContainerStarted","Data":"b878ccf3274ba063bd96760eeeec952300914f5e5e6b769c334d81849a25fd4d"} Apr 24 17:18:45.321602 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:45.321563 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" event={"ID":"0323fa27-e3b8-4ff6-bce1-99d2db80daa2","Type":"ContainerStarted","Data":"162035db105c38272a2051210ee58378f424deddc8471b435e6e3cb4ea9beda7"} Apr 24 17:18:47.051248 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:47.051197 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" podUID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.42:8643/healthz\": dial tcp 10.134.0.42:8643: connect: connection refused" Apr 24 17:18:47.055694 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:47.055661 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" podUID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 24 
17:18:48.333838 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:48.333803 2573 generic.go:358] "Generic (PLEG): container finished" podID="0323fa27-e3b8-4ff6-bce1-99d2db80daa2" containerID="162035db105c38272a2051210ee58378f424deddc8471b435e6e3cb4ea9beda7" exitCode=0 Apr 24 17:18:48.334275 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:48.333881 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" event={"ID":"0323fa27-e3b8-4ff6-bce1-99d2db80daa2","Type":"ContainerDied","Data":"162035db105c38272a2051210ee58378f424deddc8471b435e6e3cb4ea9beda7"} Apr 24 17:18:49.339078 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:49.339040 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" event={"ID":"0323fa27-e3b8-4ff6-bce1-99d2db80daa2","Type":"ContainerStarted","Data":"0b1df82c36f6afda90bb5822d7eb7676ab1dce7a21a4405736a094a9a227b796"} Apr 24 17:18:49.339078 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:49.339083 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" event={"ID":"0323fa27-e3b8-4ff6-bce1-99d2db80daa2","Type":"ContainerStarted","Data":"6b1d3c58b9915d008fe87d4b813b89014deca381d74b73791169360e17570a36"} Apr 24 17:18:49.339568 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:49.339430 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" Apr 24 17:18:49.339568 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:49.339473 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" Apr 24 17:18:49.357131 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:49.357070 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" podStartSLOduration=6.3570536650000005 podStartE2EDuration="6.357053665s" podCreationTimestamp="2026-04-24 17:18:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:18:49.355350527 +0000 UTC m=+2387.620065922" watchObservedRunningTime="2026-04-24 17:18:49.357053665 +0000 UTC m=+2387.621769042" Apr 24 17:18:50.248856 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.248828 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" Apr 24 17:18:50.340169 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.340134 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"c3fc3f20-677d-4691-bdca-ea2c32ba71cf\" (UID: \"c3fc3f20-677d-4691-bdca-ea2c32ba71cf\") " Apr 24 17:18:50.340707 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.340197 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhphv\" (UniqueName: \"kubernetes.io/projected/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-kube-api-access-xhphv\") pod \"c3fc3f20-677d-4691-bdca-ea2c32ba71cf\" (UID: \"c3fc3f20-677d-4691-bdca-ea2c32ba71cf\") " Apr 24 17:18:50.340707 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.340227 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-proxy-tls\") pod \"c3fc3f20-677d-4691-bdca-ea2c32ba71cf\" (UID: \"c3fc3f20-677d-4691-bdca-ea2c32ba71cf\") " Apr 24 17:18:50.340707 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.340257 2573 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-kserve-provision-location\") pod \"c3fc3f20-677d-4691-bdca-ea2c32ba71cf\" (UID: \"c3fc3f20-677d-4691-bdca-ea2c32ba71cf\") " Apr 24 17:18:50.340707 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.340649 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config") pod "c3fc3f20-677d-4691-bdca-ea2c32ba71cf" (UID: "c3fc3f20-677d-4691-bdca-ea2c32ba71cf"). InnerVolumeSpecName "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:18:50.340707 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.340661 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c3fc3f20-677d-4691-bdca-ea2c32ba71cf" (UID: "c3fc3f20-677d-4691-bdca-ea2c32ba71cf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:18:50.342663 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.342628 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-kube-api-access-xhphv" (OuterVolumeSpecName: "kube-api-access-xhphv") pod "c3fc3f20-677d-4691-bdca-ea2c32ba71cf" (UID: "c3fc3f20-677d-4691-bdca-ea2c32ba71cf"). InnerVolumeSpecName "kube-api-access-xhphv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:18:50.342905 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.342873 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c3fc3f20-677d-4691-bdca-ea2c32ba71cf" (UID: "c3fc3f20-677d-4691-bdca-ea2c32ba71cf"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:18:50.344982 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.344956 2573 generic.go:358] "Generic (PLEG): container finished" podID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" containerID="a897c5411d92d7d7e3c755ecfba057278fdb7c9390f6369953f99dd171c3ce2c" exitCode=0 Apr 24 17:18:50.345072 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.345032 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" event={"ID":"c3fc3f20-677d-4691-bdca-ea2c32ba71cf","Type":"ContainerDied","Data":"a897c5411d92d7d7e3c755ecfba057278fdb7c9390f6369953f99dd171c3ce2c"} Apr 24 17:18:50.345127 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.345069 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" Apr 24 17:18:50.345127 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.345077 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg" event={"ID":"c3fc3f20-677d-4691-bdca-ea2c32ba71cf","Type":"ContainerDied","Data":"5cf4c77bcd6173104bc4e595ac611dab6cee20ca5a4d0c668b3b97712fd3716d"} Apr 24 17:18:50.345127 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.345103 2573 scope.go:117] "RemoveContainer" containerID="7fa46a57603e5d73b37f865bd9d0b3ad8daf35cb48fd6e425e872394fbabafcc" Apr 24 17:18:50.360927 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.360906 2573 scope.go:117] "RemoveContainer" containerID="a897c5411d92d7d7e3c755ecfba057278fdb7c9390f6369953f99dd171c3ce2c" Apr 24 17:18:50.368789 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.368765 2573 scope.go:117] "RemoveContainer" containerID="9bbea1a870e48b4799754ed4e87aeff9a76d611e21712bda548ce005c61ea5e4" Apr 24 17:18:50.377789 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.377751 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg"] Apr 24 17:18:50.379285 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.379265 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-52psg"] Apr 24 17:18:50.382832 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.382811 2573 scope.go:117] "RemoveContainer" containerID="7fa46a57603e5d73b37f865bd9d0b3ad8daf35cb48fd6e425e872394fbabafcc" Apr 24 17:18:50.383155 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:18:50.383134 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fa46a57603e5d73b37f865bd9d0b3ad8daf35cb48fd6e425e872394fbabafcc\": container with ID starting with 
7fa46a57603e5d73b37f865bd9d0b3ad8daf35cb48fd6e425e872394fbabafcc not found: ID does not exist" containerID="7fa46a57603e5d73b37f865bd9d0b3ad8daf35cb48fd6e425e872394fbabafcc" Apr 24 17:18:50.383204 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.383167 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fa46a57603e5d73b37f865bd9d0b3ad8daf35cb48fd6e425e872394fbabafcc"} err="failed to get container status \"7fa46a57603e5d73b37f865bd9d0b3ad8daf35cb48fd6e425e872394fbabafcc\": rpc error: code = NotFound desc = could not find container \"7fa46a57603e5d73b37f865bd9d0b3ad8daf35cb48fd6e425e872394fbabafcc\": container with ID starting with 7fa46a57603e5d73b37f865bd9d0b3ad8daf35cb48fd6e425e872394fbabafcc not found: ID does not exist" Apr 24 17:18:50.383204 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.383187 2573 scope.go:117] "RemoveContainer" containerID="a897c5411d92d7d7e3c755ecfba057278fdb7c9390f6369953f99dd171c3ce2c" Apr 24 17:18:50.383476 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:18:50.383458 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a897c5411d92d7d7e3c755ecfba057278fdb7c9390f6369953f99dd171c3ce2c\": container with ID starting with a897c5411d92d7d7e3c755ecfba057278fdb7c9390f6369953f99dd171c3ce2c not found: ID does not exist" containerID="a897c5411d92d7d7e3c755ecfba057278fdb7c9390f6369953f99dd171c3ce2c" Apr 24 17:18:50.383520 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.383482 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a897c5411d92d7d7e3c755ecfba057278fdb7c9390f6369953f99dd171c3ce2c"} err="failed to get container status \"a897c5411d92d7d7e3c755ecfba057278fdb7c9390f6369953f99dd171c3ce2c\": rpc error: code = NotFound desc = could not find container \"a897c5411d92d7d7e3c755ecfba057278fdb7c9390f6369953f99dd171c3ce2c\": container with ID starting with 
a897c5411d92d7d7e3c755ecfba057278fdb7c9390f6369953f99dd171c3ce2c not found: ID does not exist" Apr 24 17:18:50.383520 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.383499 2573 scope.go:117] "RemoveContainer" containerID="9bbea1a870e48b4799754ed4e87aeff9a76d611e21712bda548ce005c61ea5e4" Apr 24 17:18:50.383724 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:18:50.383707 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bbea1a870e48b4799754ed4e87aeff9a76d611e21712bda548ce005c61ea5e4\": container with ID starting with 9bbea1a870e48b4799754ed4e87aeff9a76d611e21712bda548ce005c61ea5e4 not found: ID does not exist" containerID="9bbea1a870e48b4799754ed4e87aeff9a76d611e21712bda548ce005c61ea5e4" Apr 24 17:18:50.383765 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.383728 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bbea1a870e48b4799754ed4e87aeff9a76d611e21712bda548ce005c61ea5e4"} err="failed to get container status \"9bbea1a870e48b4799754ed4e87aeff9a76d611e21712bda548ce005c61ea5e4\": rpc error: code = NotFound desc = could not find container \"9bbea1a870e48b4799754ed4e87aeff9a76d611e21712bda548ce005c61ea5e4\": container with ID starting with 9bbea1a870e48b4799754ed4e87aeff9a76d611e21712bda548ce005c61ea5e4 not found: ID does not exist" Apr 24 17:18:50.441330 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.441262 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:18:50.441330 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.441333 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xhphv\" (UniqueName: 
\"kubernetes.io/projected/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-kube-api-access-xhphv\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:18:50.441592 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.441351 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:18:50.441592 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:50.441363 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3fc3f20-677d-4691-bdca-ea2c32ba71cf-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:18:52.217710 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:52.217675 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" path="/var/lib/kubelet/pods/c3fc3f20-677d-4691-bdca-ea2c32ba71cf/volumes" Apr 24 17:18:55.351135 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:18:55.351107 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" Apr 24 17:19:02.264391 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:19:02.264343 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log" Apr 24 17:19:02.274169 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:19:02.274138 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log" Apr 24 17:19:25.351867 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:19:25.351778 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" 
podUID="0323fa27-e3b8-4ff6-bce1-99d2db80daa2" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 17:19:35.352515 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:19:35.352469 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" podUID="0323fa27-e3b8-4ff6-bce1-99d2db80daa2" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 17:19:45.351785 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:19:45.351738 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" podUID="0323fa27-e3b8-4ff6-bce1-99d2db80daa2" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 17:19:55.352652 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:19:55.352605 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" podUID="0323fa27-e3b8-4ff6-bce1-99d2db80daa2" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.43:8080: connect: connection refused" Apr 24 17:20:05.355497 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:05.355461 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" Apr 24 17:20:13.913616 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:13.913580 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8"] Apr 24 17:20:13.914220 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:13.914022 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" podUID="0323fa27-e3b8-4ff6-bce1-99d2db80daa2" containerName="kserve-container" containerID="cri-o://6b1d3c58b9915d008fe87d4b813b89014deca381d74b73791169360e17570a36" gracePeriod=30 Apr 24 17:20:13.914220 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:13.914056 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" podUID="0323fa27-e3b8-4ff6-bce1-99d2db80daa2" containerName="kube-rbac-proxy" containerID="cri-o://0b1df82c36f6afda90bb5822d7eb7676ab1dce7a21a4405736a094a9a227b796" gracePeriod=30 Apr 24 17:20:14.048882 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.048846 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm"] Apr 24 17:20:14.049186 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.049171 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" containerName="kserve-container" Apr 24 17:20:14.049231 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.049188 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" containerName="kserve-container" Apr 24 17:20:14.049231 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.049201 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" containerName="storage-initializer" Apr 24 17:20:14.049231 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.049207 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" 
containerName="storage-initializer" Apr 24 17:20:14.049231 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.049213 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" containerName="kube-rbac-proxy" Apr 24 17:20:14.049231 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.049219 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" containerName="kube-rbac-proxy" Apr 24 17:20:14.049423 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.049275 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" containerName="kserve-container" Apr 24 17:20:14.049423 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.049284 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3fc3f20-677d-4691-bdca-ea2c32ba71cf" containerName="kube-rbac-proxy" Apr 24 17:20:14.052235 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.052209 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" Apr 24 17:20:14.054412 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.054371 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-predictor-serving-cert\"" Apr 24 17:20:14.054570 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.054378 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 24 17:20:14.060984 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.060955 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm"] Apr 24 17:20:14.127742 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.127686 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm8nc\" (UniqueName: \"kubernetes.io/projected/3dab5094-5830-4f9e-b9d0-a2df490dc372-kube-api-access-fm8nc\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm\" (UID: \"3dab5094-5830-4f9e-b9d0-a2df490dc372\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" Apr 24 17:20:14.127742 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.127744 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dab5094-5830-4f9e-b9d0-a2df490dc372-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm\" (UID: \"3dab5094-5830-4f9e-b9d0-a2df490dc372\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" Apr 24 17:20:14.128054 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.127862 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3dab5094-5830-4f9e-b9d0-a2df490dc372-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm\" (UID: \"3dab5094-5830-4f9e-b9d0-a2df490dc372\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" Apr 24 17:20:14.128054 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.127904 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3dab5094-5830-4f9e-b9d0-a2df490dc372-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm\" (UID: \"3dab5094-5830-4f9e-b9d0-a2df490dc372\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" Apr 24 17:20:14.229357 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.229227 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3dab5094-5830-4f9e-b9d0-a2df490dc372-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm\" (UID: \"3dab5094-5830-4f9e-b9d0-a2df490dc372\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" Apr 24 17:20:14.229357 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.229272 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3dab5094-5830-4f9e-b9d0-a2df490dc372-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm\" (UID: \"3dab5094-5830-4f9e-b9d0-a2df490dc372\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" Apr 24 17:20:14.229357 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.229316 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-fm8nc\" (UniqueName: \"kubernetes.io/projected/3dab5094-5830-4f9e-b9d0-a2df490dc372-kube-api-access-fm8nc\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm\" (UID: \"3dab5094-5830-4f9e-b9d0-a2df490dc372\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" Apr 24 17:20:14.229357 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.229334 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dab5094-5830-4f9e-b9d0-a2df490dc372-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm\" (UID: \"3dab5094-5830-4f9e-b9d0-a2df490dc372\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" Apr 24 17:20:14.229795 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.229773 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dab5094-5830-4f9e-b9d0-a2df490dc372-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm\" (UID: \"3dab5094-5830-4f9e-b9d0-a2df490dc372\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" Apr 24 17:20:14.230006 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.229981 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3dab5094-5830-4f9e-b9d0-a2df490dc372-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm\" (UID: \"3dab5094-5830-4f9e-b9d0-a2df490dc372\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" Apr 24 17:20:14.232018 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.231991 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3dab5094-5830-4f9e-b9d0-a2df490dc372-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm\" (UID: \"3dab5094-5830-4f9e-b9d0-a2df490dc372\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" Apr 24 17:20:14.237542 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.237519 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm8nc\" (UniqueName: \"kubernetes.io/projected/3dab5094-5830-4f9e-b9d0-a2df490dc372-kube-api-access-fm8nc\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm\" (UID: \"3dab5094-5830-4f9e-b9d0-a2df490dc372\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" Apr 24 17:20:14.364587 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.364548 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" Apr 24 17:20:14.493929 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.493901 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm"] Apr 24 17:20:14.496432 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:20:14.496402 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dab5094_5830_4f9e_b9d0_a2df490dc372.slice/crio-194f765eb8d27eef40d9509d1d7e1ebc78f6a849483dcfbcf42e6a9a4272b6d9 WatchSource:0}: Error finding container 194f765eb8d27eef40d9509d1d7e1ebc78f6a849483dcfbcf42e6a9a4272b6d9: Status 404 returned error can't find the container with id 194f765eb8d27eef40d9509d1d7e1ebc78f6a849483dcfbcf42e6a9a4272b6d9 Apr 24 17:20:14.597471 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.597435 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" 
event={"ID":"3dab5094-5830-4f9e-b9d0-a2df490dc372","Type":"ContainerStarted","Data":"9b0cffdecae5a3e938667158b2110a14c0fc3c28b90993a09428855530071cb7"} Apr 24 17:20:14.597471 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.597475 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" event={"ID":"3dab5094-5830-4f9e-b9d0-a2df490dc372","Type":"ContainerStarted","Data":"194f765eb8d27eef40d9509d1d7e1ebc78f6a849483dcfbcf42e6a9a4272b6d9"} Apr 24 17:20:14.599489 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.599456 2573 generic.go:358] "Generic (PLEG): container finished" podID="0323fa27-e3b8-4ff6-bce1-99d2db80daa2" containerID="0b1df82c36f6afda90bb5822d7eb7676ab1dce7a21a4405736a094a9a227b796" exitCode=2 Apr 24 17:20:14.599622 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:14.599513 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" event={"ID":"0323fa27-e3b8-4ff6-bce1-99d2db80daa2","Type":"ContainerDied","Data":"0b1df82c36f6afda90bb5822d7eb7676ab1dce7a21a4405736a094a9a227b796"} Apr 24 17:20:15.345613 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:15.345567 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" podUID="0323fa27-e3b8-4ff6-bce1-99d2db80daa2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.43:8643/healthz\": dial tcp 10.134.0.43:8643: connect: connection refused" Apr 24 17:20:15.352813 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:15.352777 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" podUID="0323fa27-e3b8-4ff6-bce1-99d2db80daa2" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 
10.134.0.43:8080: connect: connection refused" Apr 24 17:20:18.614032 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:18.613996 2573 generic.go:358] "Generic (PLEG): container finished" podID="3dab5094-5830-4f9e-b9d0-a2df490dc372" containerID="9b0cffdecae5a3e938667158b2110a14c0fc3c28b90993a09428855530071cb7" exitCode=0 Apr 24 17:20:18.614456 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:18.614048 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" event={"ID":"3dab5094-5830-4f9e-b9d0-a2df490dc372","Type":"ContainerDied","Data":"9b0cffdecae5a3e938667158b2110a14c0fc3c28b90993a09428855530071cb7"} Apr 24 17:20:19.456238 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.456207 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" Apr 24 17:20:19.471713 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.471676 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-proxy-tls\") pod \"0323fa27-e3b8-4ff6-bce1-99d2db80daa2\" (UID: \"0323fa27-e3b8-4ff6-bce1-99d2db80daa2\") " Apr 24 17:20:19.471904 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.471755 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtzht\" (UniqueName: \"kubernetes.io/projected/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-kube-api-access-wtzht\") pod \"0323fa27-e3b8-4ff6-bce1-99d2db80daa2\" (UID: \"0323fa27-e3b8-4ff6-bce1-99d2db80daa2\") " Apr 24 17:20:19.471904 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.471818 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"0323fa27-e3b8-4ff6-bce1-99d2db80daa2\" (UID: \"0323fa27-e3b8-4ff6-bce1-99d2db80daa2\") " Apr 24 17:20:19.471904 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.471846 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-kserve-provision-location\") pod \"0323fa27-e3b8-4ff6-bce1-99d2db80daa2\" (UID: \"0323fa27-e3b8-4ff6-bce1-99d2db80daa2\") " Apr 24 17:20:19.472218 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.472191 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0323fa27-e3b8-4ff6-bce1-99d2db80daa2" (UID: "0323fa27-e3b8-4ff6-bce1-99d2db80daa2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:20:19.472299 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.472212 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config") pod "0323fa27-e3b8-4ff6-bce1-99d2db80daa2" (UID: "0323fa27-e3b8-4ff6-bce1-99d2db80daa2"). InnerVolumeSpecName "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:20:19.474022 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.473984 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-kube-api-access-wtzht" (OuterVolumeSpecName: "kube-api-access-wtzht") pod "0323fa27-e3b8-4ff6-bce1-99d2db80daa2" (UID: "0323fa27-e3b8-4ff6-bce1-99d2db80daa2"). InnerVolumeSpecName "kube-api-access-wtzht". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:20:19.474504 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.474163 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0323fa27-e3b8-4ff6-bce1-99d2db80daa2" (UID: "0323fa27-e3b8-4ff6-bce1-99d2db80daa2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:20:19.572873 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.572827 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:20:19.572873 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.572868 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:20:19.573161 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.572883 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath 
\"\"" Apr 24 17:20:19.573161 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.572898 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wtzht\" (UniqueName: \"kubernetes.io/projected/0323fa27-e3b8-4ff6-bce1-99d2db80daa2-kube-api-access-wtzht\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:20:19.619747 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.619703 2573 generic.go:358] "Generic (PLEG): container finished" podID="0323fa27-e3b8-4ff6-bce1-99d2db80daa2" containerID="6b1d3c58b9915d008fe87d4b813b89014deca381d74b73791169360e17570a36" exitCode=0 Apr 24 17:20:19.620213 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.619801 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" Apr 24 17:20:19.620213 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.619793 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" event={"ID":"0323fa27-e3b8-4ff6-bce1-99d2db80daa2","Type":"ContainerDied","Data":"6b1d3c58b9915d008fe87d4b813b89014deca381d74b73791169360e17570a36"} Apr 24 17:20:19.620213 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.619923 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8" event={"ID":"0323fa27-e3b8-4ff6-bce1-99d2db80daa2","Type":"ContainerDied","Data":"b878ccf3274ba063bd96760eeeec952300914f5e5e6b769c334d81849a25fd4d"} Apr 24 17:20:19.620213 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.619948 2573 scope.go:117] "RemoveContainer" containerID="0b1df82c36f6afda90bb5822d7eb7676ab1dce7a21a4405736a094a9a227b796" Apr 24 17:20:19.622222 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.622174 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" 
event={"ID":"3dab5094-5830-4f9e-b9d0-a2df490dc372","Type":"ContainerStarted","Data":"e5448dcef0a3d7153a7678f8fbf5f6bd83c7d489df23c4e35b9881696ff28605"} Apr 24 17:20:19.622222 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.622220 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" event={"ID":"3dab5094-5830-4f9e-b9d0-a2df490dc372","Type":"ContainerStarted","Data":"2a0816fc8a72e393a6839f6803deefa1b46b787f42e57155b829521cc0cb8e3a"} Apr 24 17:20:19.622509 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.622488 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" Apr 24 17:20:19.622600 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.622520 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" Apr 24 17:20:19.636136 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.636104 2573 scope.go:117] "RemoveContainer" containerID="6b1d3c58b9915d008fe87d4b813b89014deca381d74b73791169360e17570a36" Apr 24 17:20:19.645016 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.644994 2573 scope.go:117] "RemoveContainer" containerID="162035db105c38272a2051210ee58378f424deddc8471b435e6e3cb4ea9beda7" Apr 24 17:20:19.645567 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.645520 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" podStartSLOduration=5.645504036 podStartE2EDuration="5.645504036s" podCreationTimestamp="2026-04-24 17:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:20:19.643748413 +0000 UTC m=+2477.908463811" watchObservedRunningTime="2026-04-24 17:20:19.645504036 +0000 UTC 
m=+2477.910219435" Apr 24 17:20:19.652852 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.652833 2573 scope.go:117] "RemoveContainer" containerID="0b1df82c36f6afda90bb5822d7eb7676ab1dce7a21a4405736a094a9a227b796" Apr 24 17:20:19.653152 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:20:19.653129 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b1df82c36f6afda90bb5822d7eb7676ab1dce7a21a4405736a094a9a227b796\": container with ID starting with 0b1df82c36f6afda90bb5822d7eb7676ab1dce7a21a4405736a094a9a227b796 not found: ID does not exist" containerID="0b1df82c36f6afda90bb5822d7eb7676ab1dce7a21a4405736a094a9a227b796" Apr 24 17:20:19.653195 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.653164 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b1df82c36f6afda90bb5822d7eb7676ab1dce7a21a4405736a094a9a227b796"} err="failed to get container status \"0b1df82c36f6afda90bb5822d7eb7676ab1dce7a21a4405736a094a9a227b796\": rpc error: code = NotFound desc = could not find container \"0b1df82c36f6afda90bb5822d7eb7676ab1dce7a21a4405736a094a9a227b796\": container with ID starting with 0b1df82c36f6afda90bb5822d7eb7676ab1dce7a21a4405736a094a9a227b796 not found: ID does not exist" Apr 24 17:20:19.653195 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.653184 2573 scope.go:117] "RemoveContainer" containerID="6b1d3c58b9915d008fe87d4b813b89014deca381d74b73791169360e17570a36" Apr 24 17:20:19.653471 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:20:19.653453 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b1d3c58b9915d008fe87d4b813b89014deca381d74b73791169360e17570a36\": container with ID starting with 6b1d3c58b9915d008fe87d4b813b89014deca381d74b73791169360e17570a36 not found: ID does not exist" containerID="6b1d3c58b9915d008fe87d4b813b89014deca381d74b73791169360e17570a36" Apr 
24 17:20:19.653543 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.653477 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b1d3c58b9915d008fe87d4b813b89014deca381d74b73791169360e17570a36"} err="failed to get container status \"6b1d3c58b9915d008fe87d4b813b89014deca381d74b73791169360e17570a36\": rpc error: code = NotFound desc = could not find container \"6b1d3c58b9915d008fe87d4b813b89014deca381d74b73791169360e17570a36\": container with ID starting with 6b1d3c58b9915d008fe87d4b813b89014deca381d74b73791169360e17570a36 not found: ID does not exist" Apr 24 17:20:19.653543 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.653492 2573 scope.go:117] "RemoveContainer" containerID="162035db105c38272a2051210ee58378f424deddc8471b435e6e3cb4ea9beda7" Apr 24 17:20:19.653717 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:20:19.653691 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"162035db105c38272a2051210ee58378f424deddc8471b435e6e3cb4ea9beda7\": container with ID starting with 162035db105c38272a2051210ee58378f424deddc8471b435e6e3cb4ea9beda7 not found: ID does not exist" containerID="162035db105c38272a2051210ee58378f424deddc8471b435e6e3cb4ea9beda7" Apr 24 17:20:19.653775 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.653722 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"162035db105c38272a2051210ee58378f424deddc8471b435e6e3cb4ea9beda7"} err="failed to get container status \"162035db105c38272a2051210ee58378f424deddc8471b435e6e3cb4ea9beda7\": rpc error: code = NotFound desc = could not find container \"162035db105c38272a2051210ee58378f424deddc8471b435e6e3cb4ea9beda7\": container with ID starting with 162035db105c38272a2051210ee58378f424deddc8471b435e6e3cb4ea9beda7 not found: ID does not exist" Apr 24 17:20:19.655748 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.655722 2573 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8"] Apr 24 17:20:19.662166 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:19.662136 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-c2lh8"] Apr 24 17:20:20.216464 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:20.216429 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0323fa27-e3b8-4ff6-bce1-99d2db80daa2" path="/var/lib/kubelet/pods/0323fa27-e3b8-4ff6-bce1-99d2db80daa2/volumes" Apr 24 17:20:25.632054 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:25.632016 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" Apr 24 17:20:55.633627 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:20:55.633538 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" podUID="3dab5094-5830-4f9e-b9d0-a2df490dc372" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.44:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.44:8080: connect: connection refused" Apr 24 17:21:05.632776 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:05.632732 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" podUID="3dab5094-5830-4f9e-b9d0-a2df490dc372" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.44:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.44:8080: connect: connection refused" Apr 24 17:21:15.632855 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:15.632803 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" 
podUID="3dab5094-5830-4f9e-b9d0-a2df490dc372" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.44:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.44:8080: connect: connection refused" Apr 24 17:21:25.633061 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:25.633012 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" podUID="3dab5094-5830-4f9e-b9d0-a2df490dc372" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.44:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.44:8080: connect: connection refused" Apr 24 17:21:35.636186 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:35.636145 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" Apr 24 17:21:44.149456 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.149417 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm"] Apr 24 17:21:44.149968 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.149833 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" podUID="3dab5094-5830-4f9e-b9d0-a2df490dc372" containerName="kserve-container" containerID="cri-o://2a0816fc8a72e393a6839f6803deefa1b46b787f42e57155b829521cc0cb8e3a" gracePeriod=30 Apr 24 17:21:44.150061 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.150033 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" podUID="3dab5094-5830-4f9e-b9d0-a2df490dc372" containerName="kube-rbac-proxy" containerID="cri-o://e5448dcef0a3d7153a7678f8fbf5f6bd83c7d489df23c4e35b9881696ff28605" gracePeriod=30 Apr 24 
17:21:44.249373 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.249329 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2"] Apr 24 17:21:44.249642 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.249630 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0323fa27-e3b8-4ff6-bce1-99d2db80daa2" containerName="storage-initializer" Apr 24 17:21:44.249700 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.249644 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0323fa27-e3b8-4ff6-bce1-99d2db80daa2" containerName="storage-initializer" Apr 24 17:21:44.249700 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.249659 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0323fa27-e3b8-4ff6-bce1-99d2db80daa2" containerName="kserve-container" Apr 24 17:21:44.249700 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.249665 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0323fa27-e3b8-4ff6-bce1-99d2db80daa2" containerName="kserve-container" Apr 24 17:21:44.249700 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.249674 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0323fa27-e3b8-4ff6-bce1-99d2db80daa2" containerName="kube-rbac-proxy" Apr 24 17:21:44.249700 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.249679 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0323fa27-e3b8-4ff6-bce1-99d2db80daa2" containerName="kube-rbac-proxy" Apr 24 17:21:44.249878 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.249721 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="0323fa27-e3b8-4ff6-bce1-99d2db80daa2" containerName="kserve-container" Apr 24 17:21:44.249878 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.249730 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="0323fa27-e3b8-4ff6-bce1-99d2db80daa2" 
containerName="kube-rbac-proxy" Apr 24 17:21:44.252853 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.252830 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" Apr 24 17:21:44.254988 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.254962 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-predictor-serving-cert\"" Apr 24 17:21:44.255259 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.255232 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\"" Apr 24 17:21:44.264046 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.264016 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2"] Apr 24 17:21:44.407290 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.407188 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8x7r\" (UniqueName: \"kubernetes.io/projected/15d8fc68-b719-487e-9786-5e5156bfbde4-kube-api-access-m8x7r\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2\" (UID: \"15d8fc68-b719-487e-9786-5e5156bfbde4\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" Apr 24 17:21:44.407290 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.407244 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/15d8fc68-b719-487e-9786-5e5156bfbde4-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2\" (UID: \"15d8fc68-b719-487e-9786-5e5156bfbde4\") " 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" Apr 24 17:21:44.407515 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.407358 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15d8fc68-b719-487e-9786-5e5156bfbde4-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2\" (UID: \"15d8fc68-b719-487e-9786-5e5156bfbde4\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" Apr 24 17:21:44.407515 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.407405 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15d8fc68-b719-487e-9786-5e5156bfbde4-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2\" (UID: \"15d8fc68-b719-487e-9786-5e5156bfbde4\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" Apr 24 17:21:44.508534 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.508494 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15d8fc68-b719-487e-9786-5e5156bfbde4-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2\" (UID: \"15d8fc68-b719-487e-9786-5e5156bfbde4\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" Apr 24 17:21:44.508534 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.508546 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15d8fc68-b719-487e-9786-5e5156bfbde4-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2\" (UID: \"15d8fc68-b719-487e-9786-5e5156bfbde4\") " 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" Apr 24 17:21:44.508825 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.508596 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8x7r\" (UniqueName: \"kubernetes.io/projected/15d8fc68-b719-487e-9786-5e5156bfbde4-kube-api-access-m8x7r\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2\" (UID: \"15d8fc68-b719-487e-9786-5e5156bfbde4\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" Apr 24 17:21:44.508825 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.508631 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/15d8fc68-b719-487e-9786-5e5156bfbde4-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2\" (UID: \"15d8fc68-b719-487e-9786-5e5156bfbde4\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" Apr 24 17:21:44.508991 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.508961 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15d8fc68-b719-487e-9786-5e5156bfbde4-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2\" (UID: \"15d8fc68-b719-487e-9786-5e5156bfbde4\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" Apr 24 17:21:44.509275 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.509256 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/15d8fc68-b719-487e-9786-5e5156bfbde4-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod 
\"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2\" (UID: \"15d8fc68-b719-487e-9786-5e5156bfbde4\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" Apr 24 17:21:44.511330 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.511279 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15d8fc68-b719-487e-9786-5e5156bfbde4-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2\" (UID: \"15d8fc68-b719-487e-9786-5e5156bfbde4\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" Apr 24 17:21:44.517172 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.517140 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8x7r\" (UniqueName: \"kubernetes.io/projected/15d8fc68-b719-487e-9786-5e5156bfbde4-kube-api-access-m8x7r\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2\" (UID: \"15d8fc68-b719-487e-9786-5e5156bfbde4\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" Apr 24 17:21:44.565571 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.565518 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" Apr 24 17:21:44.698790 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.698762 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2"] Apr 24 17:21:44.701610 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:21:44.701579 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15d8fc68_b719_487e_9786_5e5156bfbde4.slice/crio-b56e694d39d769fd464d811dd4ec1924d9459c33aeb6949f9b45555d2641c95c WatchSource:0}: Error finding container b56e694d39d769fd464d811dd4ec1924d9459c33aeb6949f9b45555d2641c95c: Status 404 returned error can't find the container with id b56e694d39d769fd464d811dd4ec1924d9459c33aeb6949f9b45555d2641c95c Apr 24 17:21:44.874431 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.874386 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" event={"ID":"15d8fc68-b719-487e-9786-5e5156bfbde4","Type":"ContainerStarted","Data":"3de66a1c3701c5965d15b1be95d17eaf10b4f688a6d983837458b2367ac89d26"} Apr 24 17:21:44.874636 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.874440 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" event={"ID":"15d8fc68-b719-487e-9786-5e5156bfbde4","Type":"ContainerStarted","Data":"b56e694d39d769fd464d811dd4ec1924d9459c33aeb6949f9b45555d2641c95c"} Apr 24 17:21:44.876430 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.876401 2573 generic.go:358] "Generic (PLEG): container finished" podID="3dab5094-5830-4f9e-b9d0-a2df490dc372" containerID="e5448dcef0a3d7153a7678f8fbf5f6bd83c7d489df23c4e35b9881696ff28605" exitCode=2 Apr 24 17:21:44.876542 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:44.876439 2573 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" event={"ID":"3dab5094-5830-4f9e-b9d0-a2df490dc372","Type":"ContainerDied","Data":"e5448dcef0a3d7153a7678f8fbf5f6bd83c7d489df23c4e35b9881696ff28605"} Apr 24 17:21:45.627084 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:45.627034 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" podUID="3dab5094-5830-4f9e-b9d0-a2df490dc372" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.44:8643/healthz\": dial tcp 10.134.0.44:8643: connect: connection refused" Apr 24 17:21:45.633013 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:45.632971 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" podUID="3dab5094-5830-4f9e-b9d0-a2df490dc372" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.44:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.44:8080: connect: connection refused" Apr 24 17:21:48.891290 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:48.891201 2573 generic.go:358] "Generic (PLEG): container finished" podID="15d8fc68-b719-487e-9786-5e5156bfbde4" containerID="3de66a1c3701c5965d15b1be95d17eaf10b4f688a6d983837458b2367ac89d26" exitCode=0 Apr 24 17:21:48.891290 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:48.891248 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" event={"ID":"15d8fc68-b719-487e-9786-5e5156bfbde4","Type":"ContainerDied","Data":"3de66a1c3701c5965d15b1be95d17eaf10b4f688a6d983837458b2367ac89d26"} Apr 24 17:21:49.490436 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.490403 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" Apr 24 17:21:49.657214 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.657126 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dab5094-5830-4f9e-b9d0-a2df490dc372-kserve-provision-location\") pod \"3dab5094-5830-4f9e-b9d0-a2df490dc372\" (UID: \"3dab5094-5830-4f9e-b9d0-a2df490dc372\") " Apr 24 17:21:49.657214 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.657176 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fm8nc\" (UniqueName: \"kubernetes.io/projected/3dab5094-5830-4f9e-b9d0-a2df490dc372-kube-api-access-fm8nc\") pod \"3dab5094-5830-4f9e-b9d0-a2df490dc372\" (UID: \"3dab5094-5830-4f9e-b9d0-a2df490dc372\") " Apr 24 17:21:49.657214 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.657195 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3dab5094-5830-4f9e-b9d0-a2df490dc372-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"3dab5094-5830-4f9e-b9d0-a2df490dc372\" (UID: \"3dab5094-5830-4f9e-b9d0-a2df490dc372\") " Apr 24 17:21:49.657545 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.657228 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3dab5094-5830-4f9e-b9d0-a2df490dc372-proxy-tls\") pod \"3dab5094-5830-4f9e-b9d0-a2df490dc372\" (UID: \"3dab5094-5830-4f9e-b9d0-a2df490dc372\") " Apr 24 17:21:49.657545 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.657502 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dab5094-5830-4f9e-b9d0-a2df490dc372-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"3dab5094-5830-4f9e-b9d0-a2df490dc372" (UID: "3dab5094-5830-4f9e-b9d0-a2df490dc372"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:21:49.657657 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.657601 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dab5094-5830-4f9e-b9d0-a2df490dc372-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config") pod "3dab5094-5830-4f9e-b9d0-a2df490dc372" (UID: "3dab5094-5830-4f9e-b9d0-a2df490dc372"). InnerVolumeSpecName "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:21:49.659573 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.659550 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dab5094-5830-4f9e-b9d0-a2df490dc372-kube-api-access-fm8nc" (OuterVolumeSpecName: "kube-api-access-fm8nc") pod "3dab5094-5830-4f9e-b9d0-a2df490dc372" (UID: "3dab5094-5830-4f9e-b9d0-a2df490dc372"). InnerVolumeSpecName "kube-api-access-fm8nc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:21:49.659670 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.659605 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dab5094-5830-4f9e-b9d0-a2df490dc372-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3dab5094-5830-4f9e-b9d0-a2df490dc372" (UID: "3dab5094-5830-4f9e-b9d0-a2df490dc372"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:21:49.758843 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.758797 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dab5094-5830-4f9e-b9d0-a2df490dc372-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:21:49.758843 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.758838 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fm8nc\" (UniqueName: \"kubernetes.io/projected/3dab5094-5830-4f9e-b9d0-a2df490dc372-kube-api-access-fm8nc\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:21:49.758843 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.758853 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3dab5094-5830-4f9e-b9d0-a2df490dc372-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:21:49.759102 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.758869 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3dab5094-5830-4f9e-b9d0-a2df490dc372-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:21:49.896556 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.896519 2573 generic.go:358] "Generic (PLEG): container finished" podID="3dab5094-5830-4f9e-b9d0-a2df490dc372" containerID="2a0816fc8a72e393a6839f6803deefa1b46b787f42e57155b829521cc0cb8e3a" exitCode=0 Apr 24 17:21:49.896999 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.896607 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" 
event={"ID":"3dab5094-5830-4f9e-b9d0-a2df490dc372","Type":"ContainerDied","Data":"2a0816fc8a72e393a6839f6803deefa1b46b787f42e57155b829521cc0cb8e3a"} Apr 24 17:21:49.896999 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.896618 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" Apr 24 17:21:49.896999 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.896648 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm" event={"ID":"3dab5094-5830-4f9e-b9d0-a2df490dc372","Type":"ContainerDied","Data":"194f765eb8d27eef40d9509d1d7e1ebc78f6a849483dcfbcf42e6a9a4272b6d9"} Apr 24 17:21:49.896999 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.896666 2573 scope.go:117] "RemoveContainer" containerID="e5448dcef0a3d7153a7678f8fbf5f6bd83c7d489df23c4e35b9881696ff28605" Apr 24 17:21:49.898701 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.898671 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" event={"ID":"15d8fc68-b719-487e-9786-5e5156bfbde4","Type":"ContainerStarted","Data":"320d31886ef05d6465a16a17e2fdb2c4f31cabf9298c50a49fbc5af9fe41a2b6"} Apr 24 17:21:49.898791 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.898710 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" event={"ID":"15d8fc68-b719-487e-9786-5e5156bfbde4","Type":"ContainerStarted","Data":"02eeccf6841014043a3bba5011e8d6e68b6736464a582358d90e2e52fddaa2b3"} Apr 24 17:21:49.898963 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.898944 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" Apr 24 17:21:49.899038 ip-10-0-142-182 kubenswrapper[2573]: I0424 
17:21:49.898977 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" Apr 24 17:21:49.905507 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.905442 2573 scope.go:117] "RemoveContainer" containerID="2a0816fc8a72e393a6839f6803deefa1b46b787f42e57155b829521cc0cb8e3a" Apr 24 17:21:49.913292 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.913270 2573 scope.go:117] "RemoveContainer" containerID="9b0cffdecae5a3e938667158b2110a14c0fc3c28b90993a09428855530071cb7" Apr 24 17:21:49.920161 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.920116 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" podStartSLOduration=5.920099555 podStartE2EDuration="5.920099555s" podCreationTimestamp="2026-04-24 17:21:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:21:49.917798037 +0000 UTC m=+2568.182513434" watchObservedRunningTime="2026-04-24 17:21:49.920099555 +0000 UTC m=+2568.184814953" Apr 24 17:21:49.921472 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.921456 2573 scope.go:117] "RemoveContainer" containerID="e5448dcef0a3d7153a7678f8fbf5f6bd83c7d489df23c4e35b9881696ff28605" Apr 24 17:21:49.921732 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:21:49.921716 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5448dcef0a3d7153a7678f8fbf5f6bd83c7d489df23c4e35b9881696ff28605\": container with ID starting with e5448dcef0a3d7153a7678f8fbf5f6bd83c7d489df23c4e35b9881696ff28605 not found: ID does not exist" containerID="e5448dcef0a3d7153a7678f8fbf5f6bd83c7d489df23c4e35b9881696ff28605" Apr 24 17:21:49.921787 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.921739 2573 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5448dcef0a3d7153a7678f8fbf5f6bd83c7d489df23c4e35b9881696ff28605"} err="failed to get container status \"e5448dcef0a3d7153a7678f8fbf5f6bd83c7d489df23c4e35b9881696ff28605\": rpc error: code = NotFound desc = could not find container \"e5448dcef0a3d7153a7678f8fbf5f6bd83c7d489df23c4e35b9881696ff28605\": container with ID starting with e5448dcef0a3d7153a7678f8fbf5f6bd83c7d489df23c4e35b9881696ff28605 not found: ID does not exist" Apr 24 17:21:49.921787 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.921755 2573 scope.go:117] "RemoveContainer" containerID="2a0816fc8a72e393a6839f6803deefa1b46b787f42e57155b829521cc0cb8e3a" Apr 24 17:21:49.921980 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:21:49.921965 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a0816fc8a72e393a6839f6803deefa1b46b787f42e57155b829521cc0cb8e3a\": container with ID starting with 2a0816fc8a72e393a6839f6803deefa1b46b787f42e57155b829521cc0cb8e3a not found: ID does not exist" containerID="2a0816fc8a72e393a6839f6803deefa1b46b787f42e57155b829521cc0cb8e3a" Apr 24 17:21:49.922018 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.921983 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a0816fc8a72e393a6839f6803deefa1b46b787f42e57155b829521cc0cb8e3a"} err="failed to get container status \"2a0816fc8a72e393a6839f6803deefa1b46b787f42e57155b829521cc0cb8e3a\": rpc error: code = NotFound desc = could not find container \"2a0816fc8a72e393a6839f6803deefa1b46b787f42e57155b829521cc0cb8e3a\": container with ID starting with 2a0816fc8a72e393a6839f6803deefa1b46b787f42e57155b829521cc0cb8e3a not found: ID does not exist" Apr 24 17:21:49.922018 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.921996 2573 scope.go:117] "RemoveContainer" containerID="9b0cffdecae5a3e938667158b2110a14c0fc3c28b90993a09428855530071cb7" Apr 24 
17:21:49.922196 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:21:49.922171 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b0cffdecae5a3e938667158b2110a14c0fc3c28b90993a09428855530071cb7\": container with ID starting with 9b0cffdecae5a3e938667158b2110a14c0fc3c28b90993a09428855530071cb7 not found: ID does not exist" containerID="9b0cffdecae5a3e938667158b2110a14c0fc3c28b90993a09428855530071cb7" Apr 24 17:21:49.922265 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.922197 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b0cffdecae5a3e938667158b2110a14c0fc3c28b90993a09428855530071cb7"} err="failed to get container status \"9b0cffdecae5a3e938667158b2110a14c0fc3c28b90993a09428855530071cb7\": rpc error: code = NotFound desc = could not find container \"9b0cffdecae5a3e938667158b2110a14c0fc3c28b90993a09428855530071cb7\": container with ID starting with 9b0cffdecae5a3e938667158b2110a14c0fc3c28b90993a09428855530071cb7 not found: ID does not exist" Apr 24 17:21:49.931904 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.931873 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm"] Apr 24 17:21:49.935585 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:49.935556 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-fclkm"] Apr 24 17:21:50.217924 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:50.217828 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dab5094-5830-4f9e-b9d0-a2df490dc372" path="/var/lib/kubelet/pods/3dab5094-5830-4f9e-b9d0-a2df490dc372/volumes" Apr 24 17:21:55.910454 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:21:55.910422 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" Apr 24 17:22:25.911940 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:22:25.911845 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" podUID="15d8fc68-b719-487e-9786-5e5156bfbde4" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.45:8080: connect: connection refused" Apr 24 17:22:35.911399 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:22:35.911345 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" podUID="15d8fc68-b719-487e-9786-5e5156bfbde4" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.45:8080: connect: connection refused" Apr 24 17:22:45.911048 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:22:45.911002 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" podUID="15d8fc68-b719-487e-9786-5e5156bfbde4" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.45:8080: connect: connection refused" Apr 24 17:22:55.911328 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:22:55.911263 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" podUID="15d8fc68-b719-487e-9786-5e5156bfbde4" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.45:8080: connect: connection refused" Apr 24 17:22:57.212662 ip-10-0-142-182 kubenswrapper[2573]: I0424 
17:22:57.212612 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" podUID="15d8fc68-b719-487e-9786-5e5156bfbde4" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.45:8080: connect: connection refused" Apr 24 17:23:07.216267 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:07.216234 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" Apr 24 17:23:14.348235 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:14.348184 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2"] Apr 24 17:23:14.348699 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:14.348556 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" podUID="15d8fc68-b719-487e-9786-5e5156bfbde4" containerName="kserve-container" containerID="cri-o://02eeccf6841014043a3bba5011e8d6e68b6736464a582358d90e2e52fddaa2b3" gracePeriod=30 Apr 24 17:23:14.348931 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:14.348891 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" podUID="15d8fc68-b719-487e-9786-5e5156bfbde4" containerName="kube-rbac-proxy" containerID="cri-o://320d31886ef05d6465a16a17e2fdb2c4f31cabf9298c50a49fbc5af9fe41a2b6" gracePeriod=30 Apr 24 17:23:15.151497 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:15.151461 2573 generic.go:358] "Generic (PLEG): container finished" podID="15d8fc68-b719-487e-9786-5e5156bfbde4" containerID="320d31886ef05d6465a16a17e2fdb2c4f31cabf9298c50a49fbc5af9fe41a2b6" exitCode=2 Apr 24 17:23:15.151691 ip-10-0-142-182 
kubenswrapper[2573]: I0424 17:23:15.151546 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" event={"ID":"15d8fc68-b719-487e-9786-5e5156bfbde4","Type":"ContainerDied","Data":"320d31886ef05d6465a16a17e2fdb2c4f31cabf9298c50a49fbc5af9fe41a2b6"} Apr 24 17:23:15.904138 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:15.904083 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" podUID="15d8fc68-b719-487e-9786-5e5156bfbde4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.45:8643/healthz\": dial tcp 10.134.0.45:8643: connect: connection refused" Apr 24 17:23:16.527069 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.527028 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl"] Apr 24 17:23:16.527432 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.527407 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3dab5094-5830-4f9e-b9d0-a2df490dc372" containerName="kube-rbac-proxy" Apr 24 17:23:16.527432 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.527428 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dab5094-5830-4f9e-b9d0-a2df490dc372" containerName="kube-rbac-proxy" Apr 24 17:23:16.527590 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.527447 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3dab5094-5830-4f9e-b9d0-a2df490dc372" containerName="kserve-container" Apr 24 17:23:16.527590 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.527453 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dab5094-5830-4f9e-b9d0-a2df490dc372" containerName="kserve-container" Apr 24 17:23:16.527590 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.527466 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="3dab5094-5830-4f9e-b9d0-a2df490dc372" containerName="storage-initializer" Apr 24 17:23:16.527590 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.527473 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dab5094-5830-4f9e-b9d0-a2df490dc372" containerName="storage-initializer" Apr 24 17:23:16.527590 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.527534 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3dab5094-5830-4f9e-b9d0-a2df490dc372" containerName="kserve-container" Apr 24 17:23:16.527590 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.527543 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3dab5094-5830-4f9e-b9d0-a2df490dc372" containerName="kube-rbac-proxy" Apr 24 17:23:16.530757 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.530737 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" Apr 24 17:23:16.532998 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.532965 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-predictor-serving-cert\"" Apr 24 17:23:16.533118 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.533010 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-kube-rbac-proxy-sar-config\"" Apr 24 17:23:16.539497 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.539471 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl"] Apr 24 17:23:16.565907 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.565865 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d69f1970-ea88-482c-b826-8889e19df0bf-proxy-tls\") pod \"isvc-sklearn-predictor-6cff579875-rbdsl\" (UID: 
\"d69f1970-ea88-482c-b826-8889e19df0bf\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" Apr 24 17:23:16.565907 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.565909 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d69f1970-ea88-482c-b826-8889e19df0bf-kserve-provision-location\") pod \"isvc-sklearn-predictor-6cff579875-rbdsl\" (UID: \"d69f1970-ea88-482c-b826-8889e19df0bf\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" Apr 24 17:23:16.566126 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.565930 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d69f1970-ea88-482c-b826-8889e19df0bf-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-6cff579875-rbdsl\" (UID: \"d69f1970-ea88-482c-b826-8889e19df0bf\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" Apr 24 17:23:16.566126 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.566020 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxjcj\" (UniqueName: \"kubernetes.io/projected/d69f1970-ea88-482c-b826-8889e19df0bf-kube-api-access-gxjcj\") pod \"isvc-sklearn-predictor-6cff579875-rbdsl\" (UID: \"d69f1970-ea88-482c-b826-8889e19df0bf\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" Apr 24 17:23:16.666612 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.666580 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d69f1970-ea88-482c-b826-8889e19df0bf-proxy-tls\") pod \"isvc-sklearn-predictor-6cff579875-rbdsl\" (UID: \"d69f1970-ea88-482c-b826-8889e19df0bf\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" Apr 
24 17:23:16.666802 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.666619 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d69f1970-ea88-482c-b826-8889e19df0bf-kserve-provision-location\") pod \"isvc-sklearn-predictor-6cff579875-rbdsl\" (UID: \"d69f1970-ea88-482c-b826-8889e19df0bf\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" Apr 24 17:23:16.666802 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.666649 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d69f1970-ea88-482c-b826-8889e19df0bf-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-6cff579875-rbdsl\" (UID: \"d69f1970-ea88-482c-b826-8889e19df0bf\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" Apr 24 17:23:16.666802 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.666704 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxjcj\" (UniqueName: \"kubernetes.io/projected/d69f1970-ea88-482c-b826-8889e19df0bf-kube-api-access-gxjcj\") pod \"isvc-sklearn-predictor-6cff579875-rbdsl\" (UID: \"d69f1970-ea88-482c-b826-8889e19df0bf\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" Apr 24 17:23:16.667069 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.667048 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d69f1970-ea88-482c-b826-8889e19df0bf-kserve-provision-location\") pod \"isvc-sklearn-predictor-6cff579875-rbdsl\" (UID: \"d69f1970-ea88-482c-b826-8889e19df0bf\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" Apr 24 17:23:16.667400 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.667379 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d69f1970-ea88-482c-b826-8889e19df0bf-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-6cff579875-rbdsl\" (UID: \"d69f1970-ea88-482c-b826-8889e19df0bf\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" Apr 24 17:23:16.669357 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.669337 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d69f1970-ea88-482c-b826-8889e19df0bf-proxy-tls\") pod \"isvc-sklearn-predictor-6cff579875-rbdsl\" (UID: \"d69f1970-ea88-482c-b826-8889e19df0bf\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" Apr 24 17:23:16.677748 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.677715 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxjcj\" (UniqueName: \"kubernetes.io/projected/d69f1970-ea88-482c-b826-8889e19df0bf-kube-api-access-gxjcj\") pod \"isvc-sklearn-predictor-6cff579875-rbdsl\" (UID: \"d69f1970-ea88-482c-b826-8889e19df0bf\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" Apr 24 17:23:16.842116 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.841999 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" Apr 24 17:23:16.971202 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.971172 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl"] Apr 24 17:23:16.973927 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:23:16.973896 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd69f1970_ea88_482c_b826_8889e19df0bf.slice/crio-f5e6143bd20ab358a01517e2fab769243fd99a8ce0d918895010a07f7e6ae7c1 WatchSource:0}: Error finding container f5e6143bd20ab358a01517e2fab769243fd99a8ce0d918895010a07f7e6ae7c1: Status 404 returned error can't find the container with id f5e6143bd20ab358a01517e2fab769243fd99a8ce0d918895010a07f7e6ae7c1 Apr 24 17:23:16.976204 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:16.976186 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 17:23:17.160835 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:17.160735 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" event={"ID":"d69f1970-ea88-482c-b826-8889e19df0bf","Type":"ContainerStarted","Data":"8792432d03802c989f7c0cd484b90121ca6b57688cf71e8ec8c61fbf62d415c8"} Apr 24 17:23:17.160835 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:17.160773 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" event={"ID":"d69f1970-ea88-482c-b826-8889e19df0bf","Type":"ContainerStarted","Data":"f5e6143bd20ab358a01517e2fab769243fd99a8ce0d918895010a07f7e6ae7c1"} Apr 24 17:23:17.212983 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:17.212922 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" 
podUID="15d8fc68-b719-487e-9786-5e5156bfbde4" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.45:8080: connect: connection refused" Apr 24 17:23:20.848427 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:20.848396 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" Apr 24 17:23:20.902775 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:20.902728 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/15d8fc68-b719-487e-9786-5e5156bfbde4-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"15d8fc68-b719-487e-9786-5e5156bfbde4\" (UID: \"15d8fc68-b719-487e-9786-5e5156bfbde4\") " Apr 24 17:23:20.902775 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:20.902781 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15d8fc68-b719-487e-9786-5e5156bfbde4-proxy-tls\") pod \"15d8fc68-b719-487e-9786-5e5156bfbde4\" (UID: \"15d8fc68-b719-487e-9786-5e5156bfbde4\") " Apr 24 17:23:20.903001 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:20.902816 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15d8fc68-b719-487e-9786-5e5156bfbde4-kserve-provision-location\") pod \"15d8fc68-b719-487e-9786-5e5156bfbde4\" (UID: \"15d8fc68-b719-487e-9786-5e5156bfbde4\") " Apr 24 17:23:20.903001 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:20.902841 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8x7r\" (UniqueName: \"kubernetes.io/projected/15d8fc68-b719-487e-9786-5e5156bfbde4-kube-api-access-m8x7r\") pod 
\"15d8fc68-b719-487e-9786-5e5156bfbde4\" (UID: \"15d8fc68-b719-487e-9786-5e5156bfbde4\") " Apr 24 17:23:20.903143 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:20.903118 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15d8fc68-b719-487e-9786-5e5156bfbde4-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config") pod "15d8fc68-b719-487e-9786-5e5156bfbde4" (UID: "15d8fc68-b719-487e-9786-5e5156bfbde4"). InnerVolumeSpecName "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:23:20.903249 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:20.903210 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15d8fc68-b719-487e-9786-5e5156bfbde4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "15d8fc68-b719-487e-9786-5e5156bfbde4" (UID: "15d8fc68-b719-487e-9786-5e5156bfbde4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:23:20.905140 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:20.905117 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15d8fc68-b719-487e-9786-5e5156bfbde4-kube-api-access-m8x7r" (OuterVolumeSpecName: "kube-api-access-m8x7r") pod "15d8fc68-b719-487e-9786-5e5156bfbde4" (UID: "15d8fc68-b719-487e-9786-5e5156bfbde4"). InnerVolumeSpecName "kube-api-access-m8x7r". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:23:20.905249 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:20.905195 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15d8fc68-b719-487e-9786-5e5156bfbde4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "15d8fc68-b719-487e-9786-5e5156bfbde4" (UID: "15d8fc68-b719-487e-9786-5e5156bfbde4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:23:21.003847 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:21.003811 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15d8fc68-b719-487e-9786-5e5156bfbde4-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:23:21.003847 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:21.003844 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m8x7r\" (UniqueName: \"kubernetes.io/projected/15d8fc68-b719-487e-9786-5e5156bfbde4-kube-api-access-m8x7r\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:23:21.004053 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:21.003857 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/15d8fc68-b719-487e-9786-5e5156bfbde4-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:23:21.004053 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:21.003868 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15d8fc68-b719-487e-9786-5e5156bfbde4-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:23:21.175035 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:21.174999 2573 generic.go:358] "Generic (PLEG): container finished" 
podID="15d8fc68-b719-487e-9786-5e5156bfbde4" containerID="02eeccf6841014043a3bba5011e8d6e68b6736464a582358d90e2e52fddaa2b3" exitCode=0 Apr 24 17:23:21.175229 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:21.175064 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" event={"ID":"15d8fc68-b719-487e-9786-5e5156bfbde4","Type":"ContainerDied","Data":"02eeccf6841014043a3bba5011e8d6e68b6736464a582358d90e2e52fddaa2b3"} Apr 24 17:23:21.175229 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:21.175090 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" event={"ID":"15d8fc68-b719-487e-9786-5e5156bfbde4","Type":"ContainerDied","Data":"b56e694d39d769fd464d811dd4ec1924d9459c33aeb6949f9b45555d2641c95c"} Apr 24 17:23:21.175229 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:21.175092 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2" Apr 24 17:23:21.175229 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:21.175109 2573 scope.go:117] "RemoveContainer" containerID="320d31886ef05d6465a16a17e2fdb2c4f31cabf9298c50a49fbc5af9fe41a2b6" Apr 24 17:23:21.176614 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:21.176589 2573 generic.go:358] "Generic (PLEG): container finished" podID="d69f1970-ea88-482c-b826-8889e19df0bf" containerID="8792432d03802c989f7c0cd484b90121ca6b57688cf71e8ec8c61fbf62d415c8" exitCode=0 Apr 24 17:23:21.176736 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:21.176630 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" event={"ID":"d69f1970-ea88-482c-b826-8889e19df0bf","Type":"ContainerDied","Data":"8792432d03802c989f7c0cd484b90121ca6b57688cf71e8ec8c61fbf62d415c8"} Apr 24 17:23:21.191522 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:21.191322 2573 scope.go:117] "RemoveContainer" containerID="02eeccf6841014043a3bba5011e8d6e68b6736464a582358d90e2e52fddaa2b3" Apr 24 17:23:21.200509 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:21.200480 2573 scope.go:117] "RemoveContainer" containerID="3de66a1c3701c5965d15b1be95d17eaf10b4f688a6d983837458b2367ac89d26" Apr 24 17:23:21.212022 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:21.211991 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2"] Apr 24 17:23:21.213888 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:21.213854 2573 scope.go:117] "RemoveContainer" containerID="320d31886ef05d6465a16a17e2fdb2c4f31cabf9298c50a49fbc5af9fe41a2b6" Apr 24 17:23:21.214224 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:23:21.214185 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"320d31886ef05d6465a16a17e2fdb2c4f31cabf9298c50a49fbc5af9fe41a2b6\": container with ID starting with 320d31886ef05d6465a16a17e2fdb2c4f31cabf9298c50a49fbc5af9fe41a2b6 not found: ID does not exist" containerID="320d31886ef05d6465a16a17e2fdb2c4f31cabf9298c50a49fbc5af9fe41a2b6" Apr 24 17:23:21.214379 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:21.214232 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"320d31886ef05d6465a16a17e2fdb2c4f31cabf9298c50a49fbc5af9fe41a2b6"} err="failed to get container status \"320d31886ef05d6465a16a17e2fdb2c4f31cabf9298c50a49fbc5af9fe41a2b6\": rpc error: code = NotFound desc = could not find container \"320d31886ef05d6465a16a17e2fdb2c4f31cabf9298c50a49fbc5af9fe41a2b6\": container with ID starting with 320d31886ef05d6465a16a17e2fdb2c4f31cabf9298c50a49fbc5af9fe41a2b6 not found: ID does not exist" Apr 24 17:23:21.214379 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:21.214252 2573 scope.go:117] "RemoveContainer" containerID="02eeccf6841014043a3bba5011e8d6e68b6736464a582358d90e2e52fddaa2b3" Apr 24 17:23:21.214590 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:23:21.214568 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02eeccf6841014043a3bba5011e8d6e68b6736464a582358d90e2e52fddaa2b3\": container with ID starting with 02eeccf6841014043a3bba5011e8d6e68b6736464a582358d90e2e52fddaa2b3 not found: ID does not exist" containerID="02eeccf6841014043a3bba5011e8d6e68b6736464a582358d90e2e52fddaa2b3" Apr 24 17:23:21.214652 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:21.214596 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02eeccf6841014043a3bba5011e8d6e68b6736464a582358d90e2e52fddaa2b3"} err="failed to get container status \"02eeccf6841014043a3bba5011e8d6e68b6736464a582358d90e2e52fddaa2b3\": rpc error: code = NotFound desc = could not find container 
\"02eeccf6841014043a3bba5011e8d6e68b6736464a582358d90e2e52fddaa2b3\": container with ID starting with 02eeccf6841014043a3bba5011e8d6e68b6736464a582358d90e2e52fddaa2b3 not found: ID does not exist" Apr 24 17:23:21.214652 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:21.214611 2573 scope.go:117] "RemoveContainer" containerID="3de66a1c3701c5965d15b1be95d17eaf10b4f688a6d983837458b2367ac89d26" Apr 24 17:23:21.214833 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:21.214811 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-8p2b2"] Apr 24 17:23:21.214887 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:23:21.214849 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3de66a1c3701c5965d15b1be95d17eaf10b4f688a6d983837458b2367ac89d26\": container with ID starting with 3de66a1c3701c5965d15b1be95d17eaf10b4f688a6d983837458b2367ac89d26 not found: ID does not exist" containerID="3de66a1c3701c5965d15b1be95d17eaf10b4f688a6d983837458b2367ac89d26" Apr 24 17:23:21.214887 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:21.214868 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de66a1c3701c5965d15b1be95d17eaf10b4f688a6d983837458b2367ac89d26"} err="failed to get container status \"3de66a1c3701c5965d15b1be95d17eaf10b4f688a6d983837458b2367ac89d26\": rpc error: code = NotFound desc = could not find container \"3de66a1c3701c5965d15b1be95d17eaf10b4f688a6d983837458b2367ac89d26\": container with ID starting with 3de66a1c3701c5965d15b1be95d17eaf10b4f688a6d983837458b2367ac89d26 not found: ID does not exist" Apr 24 17:23:22.181935 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:22.181898 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" 
event={"ID":"d69f1970-ea88-482c-b826-8889e19df0bf","Type":"ContainerStarted","Data":"7d9e5ef58a196e84b8df2b5b7b57f5d361c8321337b18a0190a988c0b14c97fa"} Apr 24 17:23:22.181935 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:22.181936 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" event={"ID":"d69f1970-ea88-482c-b826-8889e19df0bf","Type":"ContainerStarted","Data":"ca3185b7ba2e0dd60907f634a0c57141638368897b68bd7bb88c46237fc33065"} Apr 24 17:23:22.182426 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:22.182146 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" Apr 24 17:23:22.201202 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:22.201150 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" podStartSLOduration=6.201133229 podStartE2EDuration="6.201133229s" podCreationTimestamp="2026-04-24 17:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:23:22.198806782 +0000 UTC m=+2660.463522181" watchObservedRunningTime="2026-04-24 17:23:22.201133229 +0000 UTC m=+2660.465848628" Apr 24 17:23:22.217451 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:22.217419 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15d8fc68-b719-487e-9786-5e5156bfbde4" path="/var/lib/kubelet/pods/15d8fc68-b719-487e-9786-5e5156bfbde4/volumes" Apr 24 17:23:23.185026 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:23.184985 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" Apr 24 17:23:23.186386 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:23.186353 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" podUID="d69f1970-ea88-482c-b826-8889e19df0bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 24 17:23:24.188110 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:24.188059 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" podUID="d69f1970-ea88-482c-b826-8889e19df0bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 24 17:23:29.193156 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:29.193119 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" Apr 24 17:23:29.193805 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:29.193774 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" podUID="d69f1970-ea88-482c-b826-8889e19df0bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 24 17:23:39.193748 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:39.193707 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" podUID="d69f1970-ea88-482c-b826-8889e19df0bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 24 17:23:49.193809 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:23:49.193768 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" podUID="d69f1970-ea88-482c-b826-8889e19df0bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 24 17:23:59.194587 ip-10-0-142-182 kubenswrapper[2573]: I0424 
17:23:59.194494 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" podUID="d69f1970-ea88-482c-b826-8889e19df0bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 24 17:24:02.291521 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:02.291480 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log" Apr 24 17:24:02.297708 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:02.297679 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log" Apr 24 17:24:09.194011 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:09.193964 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" podUID="d69f1970-ea88-482c-b826-8889e19df0bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 24 17:24:19.194366 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:19.194299 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" podUID="d69f1970-ea88-482c-b826-8889e19df0bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 24 17:24:29.194474 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:29.194437 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" Apr 24 17:24:36.654747 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.654710 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl"] Apr 24 17:24:36.655175 
ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.655056 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" podUID="d69f1970-ea88-482c-b826-8889e19df0bf" containerName="kserve-container" containerID="cri-o://ca3185b7ba2e0dd60907f634a0c57141638368897b68bd7bb88c46237fc33065" gracePeriod=30 Apr 24 17:24:36.655175 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.655090 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" podUID="d69f1970-ea88-482c-b826-8889e19df0bf" containerName="kube-rbac-proxy" containerID="cri-o://7d9e5ef58a196e84b8df2b5b7b57f5d361c8321337b18a0190a988c0b14c97fa" gracePeriod=30 Apr 24 17:24:36.738486 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.738444 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv"] Apr 24 17:24:36.738813 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.738798 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15d8fc68-b719-487e-9786-5e5156bfbde4" containerName="kube-rbac-proxy" Apr 24 17:24:36.738892 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.738815 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d8fc68-b719-487e-9786-5e5156bfbde4" containerName="kube-rbac-proxy" Apr 24 17:24:36.738892 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.738829 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15d8fc68-b719-487e-9786-5e5156bfbde4" containerName="storage-initializer" Apr 24 17:24:36.738892 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.738835 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d8fc68-b719-487e-9786-5e5156bfbde4" containerName="storage-initializer" Apr 24 17:24:36.738892 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.738845 2573 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15d8fc68-b719-487e-9786-5e5156bfbde4" containerName="kserve-container" Apr 24 17:24:36.738892 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.738855 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d8fc68-b719-487e-9786-5e5156bfbde4" containerName="kserve-container" Apr 24 17:24:36.739077 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.738914 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="15d8fc68-b719-487e-9786-5e5156bfbde4" containerName="kserve-container" Apr 24 17:24:36.739077 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.738924 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="15d8fc68-b719-487e-9786-5e5156bfbde4" containerName="kube-rbac-proxy" Apr 24 17:24:36.742142 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.742121 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" Apr 24 17:24:36.757288 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.757261 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 24 17:24:36.758715 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.758685 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-predictor-serving-cert\"" Apr 24 17:24:36.764232 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.764203 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv"] Apr 24 17:24:36.841362 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.841288 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs89z\" (UniqueName: 
\"kubernetes.io/projected/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-kube-api-access-hs89z\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sg2gv\" (UID: \"11af2bde-bdaa-4586-9b12-b3f9bfea65fb\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" Apr 24 17:24:36.841569 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.841413 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sg2gv\" (UID: \"11af2bde-bdaa-4586-9b12-b3f9bfea65fb\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" Apr 24 17:24:36.841569 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.841455 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sg2gv\" (UID: \"11af2bde-bdaa-4586-9b12-b3f9bfea65fb\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" Apr 24 17:24:36.841569 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.841489 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sg2gv\" (UID: \"11af2bde-bdaa-4586-9b12-b3f9bfea65fb\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" Apr 24 17:24:36.942917 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.942812 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hs89z\" (UniqueName: 
\"kubernetes.io/projected/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-kube-api-access-hs89z\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sg2gv\" (UID: \"11af2bde-bdaa-4586-9b12-b3f9bfea65fb\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" Apr 24 17:24:36.942917 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.942865 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sg2gv\" (UID: \"11af2bde-bdaa-4586-9b12-b3f9bfea65fb\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" Apr 24 17:24:36.942917 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.942889 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sg2gv\" (UID: \"11af2bde-bdaa-4586-9b12-b3f9bfea65fb\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" Apr 24 17:24:36.943213 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.942924 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sg2gv\" (UID: \"11af2bde-bdaa-4586-9b12-b3f9bfea65fb\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" Apr 24 17:24:36.943213 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:24:36.942981 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-serving-cert: secret "sklearn-v2-mlserver-predictor-serving-cert" not found Apr 24 17:24:36.943213 ip-10-0-142-182 
kubenswrapper[2573]: E0424 17:24:36.943055 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-proxy-tls podName:11af2bde-bdaa-4586-9b12-b3f9bfea65fb nodeName:}" failed. No retries permitted until 2026-04-24 17:24:37.443034006 +0000 UTC m=+2735.707749385 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-proxy-tls") pod "sklearn-v2-mlserver-predictor-65d8664766-sg2gv" (UID: "11af2bde-bdaa-4586-9b12-b3f9bfea65fb") : secret "sklearn-v2-mlserver-predictor-serving-cert" not found Apr 24 17:24:36.943433 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.943409 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sg2gv\" (UID: \"11af2bde-bdaa-4586-9b12-b3f9bfea65fb\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" Apr 24 17:24:36.943696 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.943679 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sg2gv\" (UID: \"11af2bde-bdaa-4586-9b12-b3f9bfea65fb\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" Apr 24 17:24:36.953415 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:36.953388 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs89z\" (UniqueName: \"kubernetes.io/projected/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-kube-api-access-hs89z\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sg2gv\" (UID: 
\"11af2bde-bdaa-4586-9b12-b3f9bfea65fb\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" Apr 24 17:24:37.405547 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:37.405502 2573 generic.go:358] "Generic (PLEG): container finished" podID="d69f1970-ea88-482c-b826-8889e19df0bf" containerID="7d9e5ef58a196e84b8df2b5b7b57f5d361c8321337b18a0190a988c0b14c97fa" exitCode=2 Apr 24 17:24:37.405742 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:37.405569 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" event={"ID":"d69f1970-ea88-482c-b826-8889e19df0bf","Type":"ContainerDied","Data":"7d9e5ef58a196e84b8df2b5b7b57f5d361c8321337b18a0190a988c0b14c97fa"} Apr 24 17:24:37.448083 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:37.448041 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sg2gv\" (UID: \"11af2bde-bdaa-4586-9b12-b3f9bfea65fb\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" Apr 24 17:24:37.450803 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:37.450782 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-sg2gv\" (UID: \"11af2bde-bdaa-4586-9b12-b3f9bfea65fb\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" Apr 24 17:24:37.652556 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:37.652512 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" Apr 24 17:24:37.785666 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:37.785628 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv"] Apr 24 17:24:37.788914 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:24:37.788876 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11af2bde_bdaa_4586_9b12_b3f9bfea65fb.slice/crio-df414c9dad7d56c8bdf11248672a22c9548d2d4647d0159b6a16f8921df762d5 WatchSource:0}: Error finding container df414c9dad7d56c8bdf11248672a22c9548d2d4647d0159b6a16f8921df762d5: Status 404 returned error can't find the container with id df414c9dad7d56c8bdf11248672a22c9548d2d4647d0159b6a16f8921df762d5 Apr 24 17:24:38.409208 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:38.409155 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" event={"ID":"11af2bde-bdaa-4586-9b12-b3f9bfea65fb","Type":"ContainerStarted","Data":"33434172d8495de6abcb3acee6f58179447b2e8346a54a481667753c64196447"} Apr 24 17:24:38.409208 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:38.409211 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" event={"ID":"11af2bde-bdaa-4586-9b12-b3f9bfea65fb","Type":"ContainerStarted","Data":"df414c9dad7d56c8bdf11248672a22c9548d2d4647d0159b6a16f8921df762d5"} Apr 24 17:24:39.189179 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:39.189125 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" podUID="d69f1970-ea88-482c-b826-8889e19df0bf" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.46:8643/healthz\": dial tcp 10.134.0.46:8643: connect: connection refused" 
Apr 24 17:24:39.194351 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:39.194288 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" podUID="d69f1970-ea88-482c-b826-8889e19df0bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 24 17:24:41.420045 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:41.420005 2573 generic.go:358] "Generic (PLEG): container finished" podID="d69f1970-ea88-482c-b826-8889e19df0bf" containerID="ca3185b7ba2e0dd60907f634a0c57141638368897b68bd7bb88c46237fc33065" exitCode=0 Apr 24 17:24:41.420435 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:41.420063 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" event={"ID":"d69f1970-ea88-482c-b826-8889e19df0bf","Type":"ContainerDied","Data":"ca3185b7ba2e0dd60907f634a0c57141638368897b68bd7bb88c46237fc33065"} Apr 24 17:24:41.507693 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:41.507666 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" Apr 24 17:24:41.585637 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:41.585535 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d69f1970-ea88-482c-b826-8889e19df0bf-proxy-tls\") pod \"d69f1970-ea88-482c-b826-8889e19df0bf\" (UID: \"d69f1970-ea88-482c-b826-8889e19df0bf\") " Apr 24 17:24:41.585637 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:41.585578 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxjcj\" (UniqueName: \"kubernetes.io/projected/d69f1970-ea88-482c-b826-8889e19df0bf-kube-api-access-gxjcj\") pod \"d69f1970-ea88-482c-b826-8889e19df0bf\" (UID: \"d69f1970-ea88-482c-b826-8889e19df0bf\") " Apr 24 17:24:41.585637 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:41.585609 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d69f1970-ea88-482c-b826-8889e19df0bf-kserve-provision-location\") pod \"d69f1970-ea88-482c-b826-8889e19df0bf\" (UID: \"d69f1970-ea88-482c-b826-8889e19df0bf\") " Apr 24 17:24:41.585949 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:41.585685 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d69f1970-ea88-482c-b826-8889e19df0bf-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"d69f1970-ea88-482c-b826-8889e19df0bf\" (UID: \"d69f1970-ea88-482c-b826-8889e19df0bf\") " Apr 24 17:24:41.586024 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:41.585997 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d69f1970-ea88-482c-b826-8889e19df0bf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d69f1970-ea88-482c-b826-8889e19df0bf" 
(UID: "d69f1970-ea88-482c-b826-8889e19df0bf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:24:41.586088 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:41.586056 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d69f1970-ea88-482c-b826-8889e19df0bf-isvc-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-kube-rbac-proxy-sar-config") pod "d69f1970-ea88-482c-b826-8889e19df0bf" (UID: "d69f1970-ea88-482c-b826-8889e19df0bf"). InnerVolumeSpecName "isvc-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:24:41.587847 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:41.587823 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d69f1970-ea88-482c-b826-8889e19df0bf-kube-api-access-gxjcj" (OuterVolumeSpecName: "kube-api-access-gxjcj") pod "d69f1970-ea88-482c-b826-8889e19df0bf" (UID: "d69f1970-ea88-482c-b826-8889e19df0bf"). InnerVolumeSpecName "kube-api-access-gxjcj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:24:41.587933 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:41.587886 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d69f1970-ea88-482c-b826-8889e19df0bf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d69f1970-ea88-482c-b826-8889e19df0bf" (UID: "d69f1970-ea88-482c-b826-8889e19df0bf"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:24:41.687161 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:41.687128 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d69f1970-ea88-482c-b826-8889e19df0bf-isvc-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:24:41.687161 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:41.687162 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d69f1970-ea88-482c-b826-8889e19df0bf-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:24:41.687352 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:41.687173 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gxjcj\" (UniqueName: \"kubernetes.io/projected/d69f1970-ea88-482c-b826-8889e19df0bf-kube-api-access-gxjcj\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:24:41.687352 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:41.687183 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d69f1970-ea88-482c-b826-8889e19df0bf-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:24:42.424880 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:42.424839 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" event={"ID":"d69f1970-ea88-482c-b826-8889e19df0bf","Type":"ContainerDied","Data":"f5e6143bd20ab358a01517e2fab769243fd99a8ce0d918895010a07f7e6ae7c1"} Apr 24 17:24:42.424880 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:42.424873 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl" Apr 24 17:24:42.424880 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:42.424890 2573 scope.go:117] "RemoveContainer" containerID="7d9e5ef58a196e84b8df2b5b7b57f5d361c8321337b18a0190a988c0b14c97fa" Apr 24 17:24:42.426295 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:42.426268 2573 generic.go:358] "Generic (PLEG): container finished" podID="11af2bde-bdaa-4586-9b12-b3f9bfea65fb" containerID="33434172d8495de6abcb3acee6f58179447b2e8346a54a481667753c64196447" exitCode=0 Apr 24 17:24:42.426445 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:42.426351 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" event={"ID":"11af2bde-bdaa-4586-9b12-b3f9bfea65fb","Type":"ContainerDied","Data":"33434172d8495de6abcb3acee6f58179447b2e8346a54a481667753c64196447"} Apr 24 17:24:42.433565 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:42.433548 2573 scope.go:117] "RemoveContainer" containerID="ca3185b7ba2e0dd60907f634a0c57141638368897b68bd7bb88c46237fc33065" Apr 24 17:24:42.441637 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:42.441600 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl"] Apr 24 17:24:42.442096 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:42.442072 2573 scope.go:117] "RemoveContainer" containerID="8792432d03802c989f7c0cd484b90121ca6b57688cf71e8ec8c61fbf62d415c8" Apr 24 17:24:42.446098 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:42.446070 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-6cff579875-rbdsl"] Apr 24 17:24:43.431002 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:43.430958 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" 
event={"ID":"11af2bde-bdaa-4586-9b12-b3f9bfea65fb","Type":"ContainerStarted","Data":"71dca6dde13a134cffb33bda68fae1812fee324a4281dbdd02ad14a128fbaec3"} Apr 24 17:24:43.431517 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:43.431008 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" event={"ID":"11af2bde-bdaa-4586-9b12-b3f9bfea65fb","Type":"ContainerStarted","Data":"3cb1f72d3c245faafa718707ba0585323d0727e163a172364c46db178d377716"} Apr 24 17:24:43.431517 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:43.431252 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" Apr 24 17:24:43.456449 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:43.456394 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" podStartSLOduration=7.456370536 podStartE2EDuration="7.456370536s" podCreationTimestamp="2026-04-24 17:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:24:43.453092423 +0000 UTC m=+2741.717807821" watchObservedRunningTime="2026-04-24 17:24:43.456370536 +0000 UTC m=+2741.721085936" Apr 24 17:24:44.217079 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:44.217032 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d69f1970-ea88-482c-b826-8889e19df0bf" path="/var/lib/kubelet/pods/d69f1970-ea88-482c-b826-8889e19df0bf/volumes" Apr 24 17:24:44.434829 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:44.434794 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" Apr 24 17:24:50.443211 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:24:50.443181 2573 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" Apr 24 17:25:20.528435 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:20.528339 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" podUID="11af2bde-bdaa-4586-9b12-b3f9bfea65fb" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 24 17:25:30.446484 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:30.446449 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" Apr 24 17:25:36.830162 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:36.830111 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv"] Apr 24 17:25:36.830881 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:36.830564 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" podUID="11af2bde-bdaa-4586-9b12-b3f9bfea65fb" containerName="kserve-container" containerID="cri-o://3cb1f72d3c245faafa718707ba0585323d0727e163a172364c46db178d377716" gracePeriod=30 Apr 24 17:25:36.830881 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:36.830644 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" podUID="11af2bde-bdaa-4586-9b12-b3f9bfea65fb" containerName="kube-rbac-proxy" containerID="cri-o://71dca6dde13a134cffb33bda68fae1812fee324a4281dbdd02ad14a128fbaec3" gracePeriod=30 Apr 24 17:25:36.926706 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:36.926664 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx"] Apr 24 17:25:36.927048 ip-10-0-142-182 kubenswrapper[2573]: I0424 
17:25:36.927033 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d69f1970-ea88-482c-b826-8889e19df0bf" containerName="storage-initializer" Apr 24 17:25:36.927097 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:36.927051 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d69f1970-ea88-482c-b826-8889e19df0bf" containerName="storage-initializer" Apr 24 17:25:36.927097 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:36.927066 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d69f1970-ea88-482c-b826-8889e19df0bf" containerName="kube-rbac-proxy" Apr 24 17:25:36.927097 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:36.927072 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d69f1970-ea88-482c-b826-8889e19df0bf" containerName="kube-rbac-proxy" Apr 24 17:25:36.927097 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:36.927093 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d69f1970-ea88-482c-b826-8889e19df0bf" containerName="kserve-container" Apr 24 17:25:36.927226 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:36.927102 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d69f1970-ea88-482c-b826-8889e19df0bf" containerName="kserve-container" Apr 24 17:25:36.927226 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:36.927148 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d69f1970-ea88-482c-b826-8889e19df0bf" containerName="kube-rbac-proxy" Apr 24 17:25:36.927226 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:36.927159 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d69f1970-ea88-482c-b826-8889e19df0bf" containerName="kserve-container" Apr 24 17:25:36.930530 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:36.930507 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" Apr 24 17:25:36.934320 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:36.934279 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-predictor-serving-cert\"" Apr 24 17:25:36.934476 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:36.934300 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\"" Apr 24 17:25:36.949204 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:36.949174 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx"] Apr 24 17:25:37.049704 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:37.049656 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9fba819-8548-45e6-b064-6ad2784b96cd-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-588cd468b5-6w9mx\" (UID: \"a9fba819-8548-45e6-b064-6ad2784b96cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" Apr 24 17:25:37.049895 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:37.049726 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a9fba819-8548-45e6-b064-6ad2784b96cd-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-588cd468b5-6w9mx\" (UID: \"a9fba819-8548-45e6-b064-6ad2784b96cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" Apr 24 17:25:37.049895 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:37.049795 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkdlf\" 
(UniqueName: \"kubernetes.io/projected/a9fba819-8548-45e6-b064-6ad2784b96cd-kube-api-access-mkdlf\") pod \"isvc-sklearn-runtime-predictor-588cd468b5-6w9mx\" (UID: \"a9fba819-8548-45e6-b064-6ad2784b96cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" Apr 24 17:25:37.049895 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:37.049835 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9fba819-8548-45e6-b064-6ad2784b96cd-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-588cd468b5-6w9mx\" (UID: \"a9fba819-8548-45e6-b064-6ad2784b96cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" Apr 24 17:25:37.151118 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:37.151006 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a9fba819-8548-45e6-b064-6ad2784b96cd-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-588cd468b5-6w9mx\" (UID: \"a9fba819-8548-45e6-b064-6ad2784b96cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" Apr 24 17:25:37.151118 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:37.151063 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mkdlf\" (UniqueName: \"kubernetes.io/projected/a9fba819-8548-45e6-b064-6ad2784b96cd-kube-api-access-mkdlf\") pod \"isvc-sklearn-runtime-predictor-588cd468b5-6w9mx\" (UID: \"a9fba819-8548-45e6-b064-6ad2784b96cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" Apr 24 17:25:37.151118 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:37.151087 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/a9fba819-8548-45e6-b064-6ad2784b96cd-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-588cd468b5-6w9mx\" (UID: \"a9fba819-8548-45e6-b064-6ad2784b96cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" Apr 24 17:25:37.151460 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:37.151141 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9fba819-8548-45e6-b064-6ad2784b96cd-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-588cd468b5-6w9mx\" (UID: \"a9fba819-8548-45e6-b064-6ad2784b96cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" Apr 24 17:25:37.151574 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:37.151539 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9fba819-8548-45e6-b064-6ad2784b96cd-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-588cd468b5-6w9mx\" (UID: \"a9fba819-8548-45e6-b064-6ad2784b96cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" Apr 24 17:25:37.151839 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:37.151817 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a9fba819-8548-45e6-b064-6ad2784b96cd-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-588cd468b5-6w9mx\" (UID: \"a9fba819-8548-45e6-b064-6ad2784b96cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" Apr 24 17:25:37.153889 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:37.153867 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9fba819-8548-45e6-b064-6ad2784b96cd-proxy-tls\") pod 
\"isvc-sklearn-runtime-predictor-588cd468b5-6w9mx\" (UID: \"a9fba819-8548-45e6-b064-6ad2784b96cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" Apr 24 17:25:37.159903 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:37.159877 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkdlf\" (UniqueName: \"kubernetes.io/projected/a9fba819-8548-45e6-b064-6ad2784b96cd-kube-api-access-mkdlf\") pod \"isvc-sklearn-runtime-predictor-588cd468b5-6w9mx\" (UID: \"a9fba819-8548-45e6-b064-6ad2784b96cd\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" Apr 24 17:25:37.241972 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:37.241926 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" Apr 24 17:25:37.377216 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:37.377183 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx"] Apr 24 17:25:37.380466 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:25:37.380436 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9fba819_8548_45e6_b064_6ad2784b96cd.slice/crio-b4dd3c7652638f6ebb062b8cf262d51ced2781119e90fe34a01e7c6237e9ca74 WatchSource:0}: Error finding container b4dd3c7652638f6ebb062b8cf262d51ced2781119e90fe34a01e7c6237e9ca74: Status 404 returned error can't find the container with id b4dd3c7652638f6ebb062b8cf262d51ced2781119e90fe34a01e7c6237e9ca74 Apr 24 17:25:37.594532 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:37.594494 2573 generic.go:358] "Generic (PLEG): container finished" podID="11af2bde-bdaa-4586-9b12-b3f9bfea65fb" containerID="71dca6dde13a134cffb33bda68fae1812fee324a4281dbdd02ad14a128fbaec3" exitCode=2 Apr 24 17:25:37.594725 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:37.594568 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" event={"ID":"11af2bde-bdaa-4586-9b12-b3f9bfea65fb","Type":"ContainerDied","Data":"71dca6dde13a134cffb33bda68fae1812fee324a4281dbdd02ad14a128fbaec3"} Apr 24 17:25:37.595969 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:37.595930 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" event={"ID":"a9fba819-8548-45e6-b064-6ad2784b96cd","Type":"ContainerStarted","Data":"33692c4143f53143bd323bc5535a3f646d29c5cf3a018fcc30c4f8f89cafb882"} Apr 24 17:25:37.595969 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:37.595964 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" event={"ID":"a9fba819-8548-45e6-b064-6ad2784b96cd","Type":"ContainerStarted","Data":"b4dd3c7652638f6ebb062b8cf262d51ced2781119e90fe34a01e7c6237e9ca74"} Apr 24 17:25:40.437935 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:40.437883 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" podUID="11af2bde-bdaa-4586-9b12-b3f9bfea65fb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.47:8643/healthz\": dial tcp 10.134.0.47:8643: connect: connection refused" Apr 24 17:25:42.612302 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:42.612268 2573 generic.go:358] "Generic (PLEG): container finished" podID="a9fba819-8548-45e6-b064-6ad2784b96cd" containerID="33692c4143f53143bd323bc5535a3f646d29c5cf3a018fcc30c4f8f89cafb882" exitCode=0 Apr 24 17:25:42.612728 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:42.612342 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" 
event={"ID":"a9fba819-8548-45e6-b064-6ad2784b96cd","Type":"ContainerDied","Data":"33692c4143f53143bd323bc5535a3f646d29c5cf3a018fcc30c4f8f89cafb882"} Apr 24 17:25:43.617545 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:43.617504 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" event={"ID":"a9fba819-8548-45e6-b064-6ad2784b96cd","Type":"ContainerStarted","Data":"3b5cb65b293890f8abbdaa3736e34aad1dce01a71cf3d877c38245a8d72a2e19"} Apr 24 17:25:43.617545 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:43.617549 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" event={"ID":"a9fba819-8548-45e6-b064-6ad2784b96cd","Type":"ContainerStarted","Data":"80c497abda319a8a60dec4a5a6f643dddb4ccb5d96da77b70a6c81b41ea877fd"} Apr 24 17:25:43.618013 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:43.617841 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" Apr 24 17:25:43.618013 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:43.617987 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" Apr 24 17:25:43.619357 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:43.619325 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" podUID="a9fba819-8548-45e6-b064-6ad2784b96cd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 17:25:43.636921 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:43.636861 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" podStartSLOduration=7.636845363 
podStartE2EDuration="7.636845363s" podCreationTimestamp="2026-04-24 17:25:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:25:43.635994305 +0000 UTC m=+2801.900709703" watchObservedRunningTime="2026-04-24 17:25:43.636845363 +0000 UTC m=+2801.901560738" Apr 24 17:25:44.621552 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:44.621507 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" podUID="a9fba819-8548-45e6-b064-6ad2784b96cd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 17:25:45.398924 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.398896 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" Apr 24 17:25:45.529691 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.529658 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"11af2bde-bdaa-4586-9b12-b3f9bfea65fb\" (UID: \"11af2bde-bdaa-4586-9b12-b3f9bfea65fb\") " Apr 24 17:25:45.529906 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.529716 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs89z\" (UniqueName: \"kubernetes.io/projected/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-kube-api-access-hs89z\") pod \"11af2bde-bdaa-4586-9b12-b3f9bfea65fb\" (UID: \"11af2bde-bdaa-4586-9b12-b3f9bfea65fb\") " Apr 24 17:25:45.529906 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.529738 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-proxy-tls\") pod \"11af2bde-bdaa-4586-9b12-b3f9bfea65fb\" (UID: \"11af2bde-bdaa-4586-9b12-b3f9bfea65fb\") " Apr 24 17:25:45.529906 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.529779 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-kserve-provision-location\") pod \"11af2bde-bdaa-4586-9b12-b3f9bfea65fb\" (UID: \"11af2bde-bdaa-4586-9b12-b3f9bfea65fb\") " Apr 24 17:25:45.530078 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.530040 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-sklearn-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "sklearn-v2-mlserver-kube-rbac-proxy-sar-config") pod "11af2bde-bdaa-4586-9b12-b3f9bfea65fb" (UID: "11af2bde-bdaa-4586-9b12-b3f9bfea65fb"). InnerVolumeSpecName "sklearn-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:25:45.530249 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.530221 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "11af2bde-bdaa-4586-9b12-b3f9bfea65fb" (UID: "11af2bde-bdaa-4586-9b12-b3f9bfea65fb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:25:45.532112 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.532088 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-kube-api-access-hs89z" (OuterVolumeSpecName: "kube-api-access-hs89z") pod "11af2bde-bdaa-4586-9b12-b3f9bfea65fb" (UID: "11af2bde-bdaa-4586-9b12-b3f9bfea65fb"). 
InnerVolumeSpecName "kube-api-access-hs89z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:25:45.532196 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.532119 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "11af2bde-bdaa-4586-9b12-b3f9bfea65fb" (UID: "11af2bde-bdaa-4586-9b12-b3f9bfea65fb"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:25:45.626945 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.626894 2573 generic.go:358] "Generic (PLEG): container finished" podID="11af2bde-bdaa-4586-9b12-b3f9bfea65fb" containerID="3cb1f72d3c245faafa718707ba0585323d0727e163a172364c46db178d377716" exitCode=0 Apr 24 17:25:45.627471 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.626946 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" event={"ID":"11af2bde-bdaa-4586-9b12-b3f9bfea65fb","Type":"ContainerDied","Data":"3cb1f72d3c245faafa718707ba0585323d0727e163a172364c46db178d377716"} Apr 24 17:25:45.627471 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.626980 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" Apr 24 17:25:45.627471 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.626990 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv" event={"ID":"11af2bde-bdaa-4586-9b12-b3f9bfea65fb","Type":"ContainerDied","Data":"df414c9dad7d56c8bdf11248672a22c9548d2d4647d0159b6a16f8921df762d5"} Apr 24 17:25:45.627471 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.627009 2573 scope.go:117] "RemoveContainer" containerID="71dca6dde13a134cffb33bda68fae1812fee324a4281dbdd02ad14a128fbaec3" Apr 24 17:25:45.630585 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.630548 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:25:45.630761 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.630589 2573 reconciler_common.go:299] "Volume detached for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:25:45.630761 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.630608 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hs89z\" (UniqueName: \"kubernetes.io/projected/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-kube-api-access-hs89z\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:25:45.630761 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.630623 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/11af2bde-bdaa-4586-9b12-b3f9bfea65fb-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 
17:25:45.636091 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.636064 2573 scope.go:117] "RemoveContainer" containerID="3cb1f72d3c245faafa718707ba0585323d0727e163a172364c46db178d377716" Apr 24 17:25:45.644415 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.644393 2573 scope.go:117] "RemoveContainer" containerID="33434172d8495de6abcb3acee6f58179447b2e8346a54a481667753c64196447" Apr 24 17:25:45.650088 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.650059 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv"] Apr 24 17:25:45.653499 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.653469 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-sg2gv"] Apr 24 17:25:45.654014 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.653999 2573 scope.go:117] "RemoveContainer" containerID="71dca6dde13a134cffb33bda68fae1812fee324a4281dbdd02ad14a128fbaec3" Apr 24 17:25:45.654388 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:25:45.654362 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71dca6dde13a134cffb33bda68fae1812fee324a4281dbdd02ad14a128fbaec3\": container with ID starting with 71dca6dde13a134cffb33bda68fae1812fee324a4281dbdd02ad14a128fbaec3 not found: ID does not exist" containerID="71dca6dde13a134cffb33bda68fae1812fee324a4281dbdd02ad14a128fbaec3" Apr 24 17:25:45.654507 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.654397 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71dca6dde13a134cffb33bda68fae1812fee324a4281dbdd02ad14a128fbaec3"} err="failed to get container status \"71dca6dde13a134cffb33bda68fae1812fee324a4281dbdd02ad14a128fbaec3\": rpc error: code = NotFound desc = could not find container \"71dca6dde13a134cffb33bda68fae1812fee324a4281dbdd02ad14a128fbaec3\": container with ID 
starting with 71dca6dde13a134cffb33bda68fae1812fee324a4281dbdd02ad14a128fbaec3 not found: ID does not exist" Apr 24 17:25:45.654507 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.654421 2573 scope.go:117] "RemoveContainer" containerID="3cb1f72d3c245faafa718707ba0585323d0727e163a172364c46db178d377716" Apr 24 17:25:45.654681 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:25:45.654660 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cb1f72d3c245faafa718707ba0585323d0727e163a172364c46db178d377716\": container with ID starting with 3cb1f72d3c245faafa718707ba0585323d0727e163a172364c46db178d377716 not found: ID does not exist" containerID="3cb1f72d3c245faafa718707ba0585323d0727e163a172364c46db178d377716" Apr 24 17:25:45.654735 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.654686 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cb1f72d3c245faafa718707ba0585323d0727e163a172364c46db178d377716"} err="failed to get container status \"3cb1f72d3c245faafa718707ba0585323d0727e163a172364c46db178d377716\": rpc error: code = NotFound desc = could not find container \"3cb1f72d3c245faafa718707ba0585323d0727e163a172364c46db178d377716\": container with ID starting with 3cb1f72d3c245faafa718707ba0585323d0727e163a172364c46db178d377716 not found: ID does not exist" Apr 24 17:25:45.654735 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.654700 2573 scope.go:117] "RemoveContainer" containerID="33434172d8495de6abcb3acee6f58179447b2e8346a54a481667753c64196447" Apr 24 17:25:45.654916 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:25:45.654898 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33434172d8495de6abcb3acee6f58179447b2e8346a54a481667753c64196447\": container with ID starting with 33434172d8495de6abcb3acee6f58179447b2e8346a54a481667753c64196447 not found: ID does not 
exist" containerID="33434172d8495de6abcb3acee6f58179447b2e8346a54a481667753c64196447" Apr 24 17:25:45.654976 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:45.654928 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33434172d8495de6abcb3acee6f58179447b2e8346a54a481667753c64196447"} err="failed to get container status \"33434172d8495de6abcb3acee6f58179447b2e8346a54a481667753c64196447\": rpc error: code = NotFound desc = could not find container \"33434172d8495de6abcb3acee6f58179447b2e8346a54a481667753c64196447\": container with ID starting with 33434172d8495de6abcb3acee6f58179447b2e8346a54a481667753c64196447 not found: ID does not exist" Apr 24 17:25:46.217023 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:46.216974 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11af2bde-bdaa-4586-9b12-b3f9bfea65fb" path="/var/lib/kubelet/pods/11af2bde-bdaa-4586-9b12-b3f9bfea65fb/volumes" Apr 24 17:25:49.626390 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:49.626356 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" Apr 24 17:25:49.627058 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:49.626896 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" podUID="a9fba819-8548-45e6-b064-6ad2784b96cd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 24 17:25:59.627520 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:25:59.627475 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" Apr 24 17:26:13.964181 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:13.964136 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-588cd468b5-6w9mx_a9fba819-8548-45e6-b064-6ad2784b96cd/kserve-container/0.log" Apr 24 17:26:14.088031 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.087994 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx"] Apr 24 17:26:14.088552 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.088496 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" podUID="a9fba819-8548-45e6-b064-6ad2784b96cd" containerName="kserve-container" containerID="cri-o://80c497abda319a8a60dec4a5a6f643dddb4ccb5d96da77b70a6c81b41ea877fd" gracePeriod=30 Apr 24 17:26:14.088749 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.088676 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" podUID="a9fba819-8548-45e6-b064-6ad2784b96cd" containerName="kube-rbac-proxy" containerID="cri-o://3b5cb65b293890f8abbdaa3736e34aad1dce01a71cf3d877c38245a8d72a2e19" gracePeriod=30 Apr 24 17:26:14.179497 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.179454 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj"] Apr 24 17:26:14.179831 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.179816 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11af2bde-bdaa-4586-9b12-b3f9bfea65fb" containerName="storage-initializer" Apr 24 17:26:14.179896 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.179834 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="11af2bde-bdaa-4586-9b12-b3f9bfea65fb" containerName="storage-initializer" Apr 24 17:26:14.179896 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.179849 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="11af2bde-bdaa-4586-9b12-b3f9bfea65fb" containerName="kserve-container" Apr 24 17:26:14.179896 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.179856 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="11af2bde-bdaa-4586-9b12-b3f9bfea65fb" containerName="kserve-container" Apr 24 17:26:14.179896 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.179871 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11af2bde-bdaa-4586-9b12-b3f9bfea65fb" containerName="kube-rbac-proxy" Apr 24 17:26:14.179896 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.179877 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="11af2bde-bdaa-4586-9b12-b3f9bfea65fb" containerName="kube-rbac-proxy" Apr 24 17:26:14.180047 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.179933 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="11af2bde-bdaa-4586-9b12-b3f9bfea65fb" containerName="kserve-container" Apr 24 17:26:14.180047 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.179942 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="11af2bde-bdaa-4586-9b12-b3f9bfea65fb" containerName="kube-rbac-proxy" Apr 24 17:26:14.183361 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.183296 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" Apr 24 17:26:14.185429 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.185402 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-predictor-serving-cert\"" Apr 24 17:26:14.185534 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.185426 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 24 17:26:14.193522 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.193494 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj"] Apr 24 17:26:14.253018 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.252982 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58c509e8-23df-42b6-aad6-7ce3b17a1839-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj\" (UID: \"58c509e8-23df-42b6-aad6-7ce3b17a1839\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" Apr 24 17:26:14.253018 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.253020 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4cbr\" (UniqueName: \"kubernetes.io/projected/58c509e8-23df-42b6-aad6-7ce3b17a1839-kube-api-access-s4cbr\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj\" (UID: \"58c509e8-23df-42b6-aad6-7ce3b17a1839\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" Apr 24 17:26:14.253247 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.253088 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/58c509e8-23df-42b6-aad6-7ce3b17a1839-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj\" (UID: \"58c509e8-23df-42b6-aad6-7ce3b17a1839\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" Apr 24 17:26:14.253247 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.253132 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/58c509e8-23df-42b6-aad6-7ce3b17a1839-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj\" (UID: \"58c509e8-23df-42b6-aad6-7ce3b17a1839\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" Apr 24 17:26:14.353555 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.353511 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58c509e8-23df-42b6-aad6-7ce3b17a1839-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj\" (UID: \"58c509e8-23df-42b6-aad6-7ce3b17a1839\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" Apr 24 17:26:14.353555 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.353558 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/58c509e8-23df-42b6-aad6-7ce3b17a1839-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj\" (UID: \"58c509e8-23df-42b6-aad6-7ce3b17a1839\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" Apr 24 17:26:14.353822 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.353626 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/58c509e8-23df-42b6-aad6-7ce3b17a1839-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj\" (UID: \"58c509e8-23df-42b6-aad6-7ce3b17a1839\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" Apr 24 17:26:14.353822 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.353643 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4cbr\" (UniqueName: \"kubernetes.io/projected/58c509e8-23df-42b6-aad6-7ce3b17a1839-kube-api-access-s4cbr\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj\" (UID: \"58c509e8-23df-42b6-aad6-7ce3b17a1839\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" Apr 24 17:26:14.354055 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.354032 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58c509e8-23df-42b6-aad6-7ce3b17a1839-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj\" (UID: \"58c509e8-23df-42b6-aad6-7ce3b17a1839\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" Apr 24 17:26:14.354388 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.354366 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/58c509e8-23df-42b6-aad6-7ce3b17a1839-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj\" (UID: \"58c509e8-23df-42b6-aad6-7ce3b17a1839\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" Apr 24 17:26:14.356239 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.356218 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/58c509e8-23df-42b6-aad6-7ce3b17a1839-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj\" (UID: \"58c509e8-23df-42b6-aad6-7ce3b17a1839\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" Apr 24 17:26:14.361827 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.361801 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4cbr\" (UniqueName: \"kubernetes.io/projected/58c509e8-23df-42b6-aad6-7ce3b17a1839-kube-api-access-s4cbr\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj\" (UID: \"58c509e8-23df-42b6-aad6-7ce3b17a1839\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" Apr 24 17:26:14.495148 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.495107 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" Apr 24 17:26:14.621924 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.621880 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" podUID="a9fba819-8548-45e6-b064-6ad2784b96cd" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.48:8643/healthz\": dial tcp 10.134.0.48:8643: connect: connection refused" Apr 24 17:26:14.634512 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.634484 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj"] Apr 24 17:26:14.637357 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:26:14.637302 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58c509e8_23df_42b6_aad6_7ce3b17a1839.slice/crio-2d3e76ffc75b2fe4330cde552e2522da4a6a11ffd6d7a0bedc11ace358161eb7 WatchSource:0}: Error finding container 
2d3e76ffc75b2fe4330cde552e2522da4a6a11ffd6d7a0bedc11ace358161eb7: Status 404 returned error can't find the container with id 2d3e76ffc75b2fe4330cde552e2522da4a6a11ffd6d7a0bedc11ace358161eb7 Apr 24 17:26:14.717754 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.717716 2573 generic.go:358] "Generic (PLEG): container finished" podID="a9fba819-8548-45e6-b064-6ad2784b96cd" containerID="3b5cb65b293890f8abbdaa3736e34aad1dce01a71cf3d877c38245a8d72a2e19" exitCode=2 Apr 24 17:26:14.717940 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.717800 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" event={"ID":"a9fba819-8548-45e6-b064-6ad2784b96cd","Type":"ContainerDied","Data":"3b5cb65b293890f8abbdaa3736e34aad1dce01a71cf3d877c38245a8d72a2e19"} Apr 24 17:26:14.719148 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.719119 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" event={"ID":"58c509e8-23df-42b6-aad6-7ce3b17a1839","Type":"ContainerStarted","Data":"3c2c92c9f2ab8d06ef6dabbe35ac37658d0a19b41d2e2991a9d52603683dcbd2"} Apr 24 17:26:14.719271 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:14.719155 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" event={"ID":"58c509e8-23df-42b6-aad6-7ce3b17a1839","Type":"ContainerStarted","Data":"2d3e76ffc75b2fe4330cde552e2522da4a6a11ffd6d7a0bedc11ace358161eb7"} Apr 24 17:26:15.128621 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.128589 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" Apr 24 17:26:15.160378 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.160339 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9fba819-8548-45e6-b064-6ad2784b96cd-proxy-tls\") pod \"a9fba819-8548-45e6-b064-6ad2784b96cd\" (UID: \"a9fba819-8548-45e6-b064-6ad2784b96cd\") " Apr 24 17:26:15.160559 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.160427 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9fba819-8548-45e6-b064-6ad2784b96cd-kserve-provision-location\") pod \"a9fba819-8548-45e6-b064-6ad2784b96cd\" (UID: \"a9fba819-8548-45e6-b064-6ad2784b96cd\") " Apr 24 17:26:15.160559 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.160484 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkdlf\" (UniqueName: \"kubernetes.io/projected/a9fba819-8548-45e6-b064-6ad2784b96cd-kube-api-access-mkdlf\") pod \"a9fba819-8548-45e6-b064-6ad2784b96cd\" (UID: \"a9fba819-8548-45e6-b064-6ad2784b96cd\") " Apr 24 17:26:15.160559 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.160527 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a9fba819-8548-45e6-b064-6ad2784b96cd-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"a9fba819-8548-45e6-b064-6ad2784b96cd\" (UID: \"a9fba819-8548-45e6-b064-6ad2784b96cd\") " Apr 24 17:26:15.161031 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.161003 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9fba819-8548-45e6-b064-6ad2784b96cd-isvc-sklearn-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-sklearn-runtime-kube-rbac-proxy-sar-config") pod "a9fba819-8548-45e6-b064-6ad2784b96cd" (UID: "a9fba819-8548-45e6-b064-6ad2784b96cd"). InnerVolumeSpecName "isvc-sklearn-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:26:15.163113 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.163079 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9fba819-8548-45e6-b064-6ad2784b96cd-kube-api-access-mkdlf" (OuterVolumeSpecName: "kube-api-access-mkdlf") pod "a9fba819-8548-45e6-b064-6ad2784b96cd" (UID: "a9fba819-8548-45e6-b064-6ad2784b96cd"). InnerVolumeSpecName "kube-api-access-mkdlf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:26:15.163229 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.163191 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9fba819-8548-45e6-b064-6ad2784b96cd-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a9fba819-8548-45e6-b064-6ad2784b96cd" (UID: "a9fba819-8548-45e6-b064-6ad2784b96cd"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:26:15.191444 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.191350 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9fba819-8548-45e6-b064-6ad2784b96cd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a9fba819-8548-45e6-b064-6ad2784b96cd" (UID: "a9fba819-8548-45e6-b064-6ad2784b96cd"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:26:15.261529 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.261486 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mkdlf\" (UniqueName: \"kubernetes.io/projected/a9fba819-8548-45e6-b064-6ad2784b96cd-kube-api-access-mkdlf\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:26:15.261529 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.261522 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a9fba819-8548-45e6-b064-6ad2784b96cd-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:26:15.261529 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.261535 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9fba819-8548-45e6-b064-6ad2784b96cd-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:26:15.261771 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.261545 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9fba819-8548-45e6-b064-6ad2784b96cd-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:26:15.723929 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.723892 2573 generic.go:358] "Generic (PLEG): container finished" podID="a9fba819-8548-45e6-b064-6ad2784b96cd" containerID="80c497abda319a8a60dec4a5a6f643dddb4ccb5d96da77b70a6c81b41ea877fd" exitCode=0 Apr 24 17:26:15.724146 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.723980 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" 
event={"ID":"a9fba819-8548-45e6-b064-6ad2784b96cd","Type":"ContainerDied","Data":"80c497abda319a8a60dec4a5a6f643dddb4ccb5d96da77b70a6c81b41ea877fd"} Apr 24 17:26:15.724146 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.723992 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" Apr 24 17:26:15.724146 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.724019 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx" event={"ID":"a9fba819-8548-45e6-b064-6ad2784b96cd","Type":"ContainerDied","Data":"b4dd3c7652638f6ebb062b8cf262d51ced2781119e90fe34a01e7c6237e9ca74"} Apr 24 17:26:15.724146 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.724035 2573 scope.go:117] "RemoveContainer" containerID="3b5cb65b293890f8abbdaa3736e34aad1dce01a71cf3d877c38245a8d72a2e19" Apr 24 17:26:15.732366 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.732349 2573 scope.go:117] "RemoveContainer" containerID="80c497abda319a8a60dec4a5a6f643dddb4ccb5d96da77b70a6c81b41ea877fd" Apr 24 17:26:15.740847 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.740820 2573 scope.go:117] "RemoveContainer" containerID="33692c4143f53143bd323bc5535a3f646d29c5cf3a018fcc30c4f8f89cafb882" Apr 24 17:26:15.745079 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.745048 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx"] Apr 24 17:26:15.748980 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.748951 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-588cd468b5-6w9mx"] Apr 24 17:26:15.749260 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.749242 2573 scope.go:117] "RemoveContainer" containerID="3b5cb65b293890f8abbdaa3736e34aad1dce01a71cf3d877c38245a8d72a2e19" Apr 24 17:26:15.749619 ip-10-0-142-182 
kubenswrapper[2573]: E0424 17:26:15.749599 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b5cb65b293890f8abbdaa3736e34aad1dce01a71cf3d877c38245a8d72a2e19\": container with ID starting with 3b5cb65b293890f8abbdaa3736e34aad1dce01a71cf3d877c38245a8d72a2e19 not found: ID does not exist" containerID="3b5cb65b293890f8abbdaa3736e34aad1dce01a71cf3d877c38245a8d72a2e19" Apr 24 17:26:15.749677 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.749629 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b5cb65b293890f8abbdaa3736e34aad1dce01a71cf3d877c38245a8d72a2e19"} err="failed to get container status \"3b5cb65b293890f8abbdaa3736e34aad1dce01a71cf3d877c38245a8d72a2e19\": rpc error: code = NotFound desc = could not find container \"3b5cb65b293890f8abbdaa3736e34aad1dce01a71cf3d877c38245a8d72a2e19\": container with ID starting with 3b5cb65b293890f8abbdaa3736e34aad1dce01a71cf3d877c38245a8d72a2e19 not found: ID does not exist" Apr 24 17:26:15.749677 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.749649 2573 scope.go:117] "RemoveContainer" containerID="80c497abda319a8a60dec4a5a6f643dddb4ccb5d96da77b70a6c81b41ea877fd" Apr 24 17:26:15.749891 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:26:15.749873 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80c497abda319a8a60dec4a5a6f643dddb4ccb5d96da77b70a6c81b41ea877fd\": container with ID starting with 80c497abda319a8a60dec4a5a6f643dddb4ccb5d96da77b70a6c81b41ea877fd not found: ID does not exist" containerID="80c497abda319a8a60dec4a5a6f643dddb4ccb5d96da77b70a6c81b41ea877fd" Apr 24 17:26:15.749957 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.749901 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80c497abda319a8a60dec4a5a6f643dddb4ccb5d96da77b70a6c81b41ea877fd"} 
err="failed to get container status \"80c497abda319a8a60dec4a5a6f643dddb4ccb5d96da77b70a6c81b41ea877fd\": rpc error: code = NotFound desc = could not find container \"80c497abda319a8a60dec4a5a6f643dddb4ccb5d96da77b70a6c81b41ea877fd\": container with ID starting with 80c497abda319a8a60dec4a5a6f643dddb4ccb5d96da77b70a6c81b41ea877fd not found: ID does not exist" Apr 24 17:26:15.749957 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.749927 2573 scope.go:117] "RemoveContainer" containerID="33692c4143f53143bd323bc5535a3f646d29c5cf3a018fcc30c4f8f89cafb882" Apr 24 17:26:15.750129 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:26:15.750112 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33692c4143f53143bd323bc5535a3f646d29c5cf3a018fcc30c4f8f89cafb882\": container with ID starting with 33692c4143f53143bd323bc5535a3f646d29c5cf3a018fcc30c4f8f89cafb882 not found: ID does not exist" containerID="33692c4143f53143bd323bc5535a3f646d29c5cf3a018fcc30c4f8f89cafb882" Apr 24 17:26:15.750168 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:15.750133 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33692c4143f53143bd323bc5535a3f646d29c5cf3a018fcc30c4f8f89cafb882"} err="failed to get container status \"33692c4143f53143bd323bc5535a3f646d29c5cf3a018fcc30c4f8f89cafb882\": rpc error: code = NotFound desc = could not find container \"33692c4143f53143bd323bc5535a3f646d29c5cf3a018fcc30c4f8f89cafb882\": container with ID starting with 33692c4143f53143bd323bc5535a3f646d29c5cf3a018fcc30c4f8f89cafb882 not found: ID does not exist" Apr 24 17:26:16.218186 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:16.218143 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9fba819-8548-45e6-b064-6ad2784b96cd" path="/var/lib/kubelet/pods/a9fba819-8548-45e6-b064-6ad2784b96cd/volumes" Apr 24 17:26:18.735432 ip-10-0-142-182 kubenswrapper[2573]: I0424 
17:26:18.735239 2573 generic.go:358] "Generic (PLEG): container finished" podID="58c509e8-23df-42b6-aad6-7ce3b17a1839" containerID="3c2c92c9f2ab8d06ef6dabbe35ac37658d0a19b41d2e2991a9d52603683dcbd2" exitCode=0 Apr 24 17:26:18.735432 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:18.735292 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" event={"ID":"58c509e8-23df-42b6-aad6-7ce3b17a1839","Type":"ContainerDied","Data":"3c2c92c9f2ab8d06ef6dabbe35ac37658d0a19b41d2e2991a9d52603683dcbd2"} Apr 24 17:26:19.739604 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:19.739566 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" event={"ID":"58c509e8-23df-42b6-aad6-7ce3b17a1839","Type":"ContainerStarted","Data":"40ed356957c174d515c61ca409a009b24077e5bdf029f706ba80caf88d0076e6"} Apr 24 17:26:19.739604 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:19.739610 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" event={"ID":"58c509e8-23df-42b6-aad6-7ce3b17a1839","Type":"ContainerStarted","Data":"269cf09687f09c582b09b724348a870d19a2b42fec702e336e7a954ef74a2b89"} Apr 24 17:26:19.740133 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:19.739857 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" Apr 24 17:26:19.740133 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:19.739888 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" Apr 24 17:26:19.761199 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:19.761130 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" podStartSLOduration=5.761107105 podStartE2EDuration="5.761107105s" podCreationTimestamp="2026-04-24 17:26:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:26:19.75881279 +0000 UTC m=+2838.023528244" watchObservedRunningTime="2026-04-24 17:26:19.761107105 +0000 UTC m=+2838.025822504" Apr 24 17:26:25.747800 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:25.747768 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" Apr 24 17:26:55.827922 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:26:55.827824 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" podUID="58c509e8-23df-42b6-aad6-7ce3b17a1839" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 24 17:27:05.750945 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:05.750906 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" Apr 24 17:27:14.256464 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.256421 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj"] Apr 24 17:27:14.256909 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.256747 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" podUID="58c509e8-23df-42b6-aad6-7ce3b17a1839" containerName="kserve-container" containerID="cri-o://269cf09687f09c582b09b724348a870d19a2b42fec702e336e7a954ef74a2b89" gracePeriod=30 Apr 24 17:27:14.256909 ip-10-0-142-182 kubenswrapper[2573]: I0424 
17:27:14.256793 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" podUID="58c509e8-23df-42b6-aad6-7ce3b17a1839" containerName="kube-rbac-proxy" containerID="cri-o://40ed356957c174d515c61ca409a009b24077e5bdf029f706ba80caf88d0076e6" gracePeriod=30 Apr 24 17:27:14.340891 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.340851 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp"] Apr 24 17:27:14.341165 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.341153 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9fba819-8548-45e6-b064-6ad2784b96cd" containerName="kserve-container" Apr 24 17:27:14.341207 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.341167 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9fba819-8548-45e6-b064-6ad2784b96cd" containerName="kserve-container" Apr 24 17:27:14.341207 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.341188 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9fba819-8548-45e6-b064-6ad2784b96cd" containerName="storage-initializer" Apr 24 17:27:14.341207 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.341194 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9fba819-8548-45e6-b064-6ad2784b96cd" containerName="storage-initializer" Apr 24 17:27:14.341207 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.341205 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9fba819-8548-45e6-b064-6ad2784b96cd" containerName="kube-rbac-proxy" Apr 24 17:27:14.341368 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.341211 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9fba819-8548-45e6-b064-6ad2784b96cd" containerName="kube-rbac-proxy" Apr 24 17:27:14.341368 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.341257 2573 
memory_manager.go:356] "RemoveStaleState removing state" podUID="a9fba819-8548-45e6-b064-6ad2784b96cd" containerName="kserve-container" Apr 24 17:27:14.341368 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.341264 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a9fba819-8548-45e6-b064-6ad2784b96cd" containerName="kube-rbac-proxy" Apr 24 17:27:14.348184 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.348142 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" Apr 24 17:27:14.348586 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.348529 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/136aba73-6488-478f-96da-bd551f727cef-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-6779b78747-8rqlp\" (UID: \"136aba73-6488-478f-96da-bd551f727cef\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" Apr 24 17:27:14.348698 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.348655 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/136aba73-6488-478f-96da-bd551f727cef-proxy-tls\") pod \"isvc-sklearn-v2-predictor-6779b78747-8rqlp\" (UID: \"136aba73-6488-478f-96da-bd551f727cef\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" Apr 24 17:27:14.348755 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.348696 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/136aba73-6488-478f-96da-bd551f727cef-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-6779b78747-8rqlp\" (UID: \"136aba73-6488-478f-96da-bd551f727cef\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" Apr 24 17:27:14.348755 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.348724 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q2gv\" (UniqueName: \"kubernetes.io/projected/136aba73-6488-478f-96da-bd551f727cef-kube-api-access-9q2gv\") pod \"isvc-sklearn-v2-predictor-6779b78747-8rqlp\" (UID: \"136aba73-6488-478f-96da-bd551f727cef\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" Apr 24 17:27:14.350704 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.350489 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-predictor-serving-cert\"" Apr 24 17:27:14.350844 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.350700 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 24 17:27:14.352674 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.352649 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp"] Apr 24 17:27:14.449697 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.449655 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/136aba73-6488-478f-96da-bd551f727cef-proxy-tls\") pod \"isvc-sklearn-v2-predictor-6779b78747-8rqlp\" (UID: \"136aba73-6488-478f-96da-bd551f727cef\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" Apr 24 17:27:14.449911 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.449713 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/136aba73-6488-478f-96da-bd551f727cef-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod 
\"isvc-sklearn-v2-predictor-6779b78747-8rqlp\" (UID: \"136aba73-6488-478f-96da-bd551f727cef\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" Apr 24 17:27:14.449911 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.449741 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9q2gv\" (UniqueName: \"kubernetes.io/projected/136aba73-6488-478f-96da-bd551f727cef-kube-api-access-9q2gv\") pod \"isvc-sklearn-v2-predictor-6779b78747-8rqlp\" (UID: \"136aba73-6488-478f-96da-bd551f727cef\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" Apr 24 17:27:14.449911 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.449775 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/136aba73-6488-478f-96da-bd551f727cef-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-6779b78747-8rqlp\" (UID: \"136aba73-6488-478f-96da-bd551f727cef\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" Apr 24 17:27:14.449911 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:27:14.449819 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-v2-predictor-serving-cert: secret "isvc-sklearn-v2-predictor-serving-cert" not found Apr 24 17:27:14.449911 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:27:14.449910 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/136aba73-6488-478f-96da-bd551f727cef-proxy-tls podName:136aba73-6488-478f-96da-bd551f727cef nodeName:}" failed. No retries permitted until 2026-04-24 17:27:14.949886362 +0000 UTC m=+2893.214601740 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/136aba73-6488-478f-96da-bd551f727cef-proxy-tls") pod "isvc-sklearn-v2-predictor-6779b78747-8rqlp" (UID: "136aba73-6488-478f-96da-bd551f727cef") : secret "isvc-sklearn-v2-predictor-serving-cert" not found Apr 24 17:27:14.450302 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.450277 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/136aba73-6488-478f-96da-bd551f727cef-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-6779b78747-8rqlp\" (UID: \"136aba73-6488-478f-96da-bd551f727cef\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" Apr 24 17:27:14.450601 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.450582 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/136aba73-6488-478f-96da-bd551f727cef-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-6779b78747-8rqlp\" (UID: \"136aba73-6488-478f-96da-bd551f727cef\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" Apr 24 17:27:14.461696 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.461658 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q2gv\" (UniqueName: \"kubernetes.io/projected/136aba73-6488-478f-96da-bd551f727cef-kube-api-access-9q2gv\") pod \"isvc-sklearn-v2-predictor-6779b78747-8rqlp\" (UID: \"136aba73-6488-478f-96da-bd551f727cef\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" Apr 24 17:27:14.905262 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.905224 2573 generic.go:358] "Generic (PLEG): container finished" podID="58c509e8-23df-42b6-aad6-7ce3b17a1839" containerID="40ed356957c174d515c61ca409a009b24077e5bdf029f706ba80caf88d0076e6" exitCode=2 Apr 24 17:27:14.905457 
ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.905300 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" event={"ID":"58c509e8-23df-42b6-aad6-7ce3b17a1839","Type":"ContainerDied","Data":"40ed356957c174d515c61ca409a009b24077e5bdf029f706ba80caf88d0076e6"} Apr 24 17:27:14.953741 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.953699 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/136aba73-6488-478f-96da-bd551f727cef-proxy-tls\") pod \"isvc-sklearn-v2-predictor-6779b78747-8rqlp\" (UID: \"136aba73-6488-478f-96da-bd551f727cef\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" Apr 24 17:27:14.956418 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.956389 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/136aba73-6488-478f-96da-bd551f727cef-proxy-tls\") pod \"isvc-sklearn-v2-predictor-6779b78747-8rqlp\" (UID: \"136aba73-6488-478f-96da-bd551f727cef\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" Apr 24 17:27:14.960319 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:14.960280 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" Apr 24 17:27:15.089150 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:15.089110 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp"] Apr 24 17:27:15.092353 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:27:15.092300 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod136aba73_6488_478f_96da_bd551f727cef.slice/crio-b2c0cb44733b00132467028586851e69d023305f85dc9e8b5fadddffa566afc7 WatchSource:0}: Error finding container b2c0cb44733b00132467028586851e69d023305f85dc9e8b5fadddffa566afc7: Status 404 returned error can't find the container with id b2c0cb44733b00132467028586851e69d023305f85dc9e8b5fadddffa566afc7 Apr 24 17:27:15.743188 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:15.743139 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" podUID="58c509e8-23df-42b6-aad6-7ce3b17a1839" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.49:8643/healthz\": dial tcp 10.134.0.49:8643: connect: connection refused" Apr 24 17:27:15.909529 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:15.909484 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" event={"ID":"136aba73-6488-478f-96da-bd551f727cef","Type":"ContainerStarted","Data":"ca8cb04ef049c11a1ec531a315ca88139078d0832e52880a8d91f6ebd0c4ab77"} Apr 24 17:27:15.909685 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:15.909537 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" event={"ID":"136aba73-6488-478f-96da-bd551f727cef","Type":"ContainerStarted","Data":"b2c0cb44733b00132467028586851e69d023305f85dc9e8b5fadddffa566afc7"} Apr 24 
17:27:16.789550 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:16.789499 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" podUID="58c509e8-23df-42b6-aad6-7ce3b17a1839" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.49:8080/v2/models/isvc-sklearn-v2-runtime/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 24 17:27:18.920434 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:18.920399 2573 generic.go:358] "Generic (PLEG): container finished" podID="136aba73-6488-478f-96da-bd551f727cef" containerID="ca8cb04ef049c11a1ec531a315ca88139078d0832e52880a8d91f6ebd0c4ab77" exitCode=0 Apr 24 17:27:18.920848 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:18.920484 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" event={"ID":"136aba73-6488-478f-96da-bd551f727cef","Type":"ContainerDied","Data":"ca8cb04ef049c11a1ec531a315ca88139078d0832e52880a8d91f6ebd0c4ab77"} Apr 24 17:27:19.925552 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:19.925515 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" event={"ID":"136aba73-6488-478f-96da-bd551f727cef","Type":"ContainerStarted","Data":"9ac449a90c07716797dc6cfa2d6605abbe951fd1c985f529fa485606d1916ea6"} Apr 24 17:27:19.925552 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:19.925559 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" event={"ID":"136aba73-6488-478f-96da-bd551f727cef","Type":"ContainerStarted","Data":"39363f920339c2149a48da4d9bc57d2727982db150fe5e16e7e30f10c2dd053d"} Apr 24 17:27:19.926066 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:19.925857 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" Apr 24 17:27:19.944523 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:19.944456 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" podStartSLOduration=5.944437597 podStartE2EDuration="5.944437597s" podCreationTimestamp="2026-04-24 17:27:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:27:19.943076917 +0000 UTC m=+2898.207792315" watchObservedRunningTime="2026-04-24 17:27:19.944437597 +0000 UTC m=+2898.209153007" Apr 24 17:27:20.743500 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:20.743445 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" podUID="58c509e8-23df-42b6-aad6-7ce3b17a1839" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.49:8643/healthz\": dial tcp 10.134.0.49:8643: connect: connection refused" Apr 24 17:27:20.929246 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:20.929207 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" Apr 24 17:27:20.930651 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:20.930611 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" podUID="136aba73-6488-478f-96da-bd551f727cef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 17:27:21.932998 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:21.932956 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" podUID="136aba73-6488-478f-96da-bd551f727cef" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 17:27:22.698432 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.698400 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" Apr 24 17:27:22.716167 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.716124 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/58c509e8-23df-42b6-aad6-7ce3b17a1839-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"58c509e8-23df-42b6-aad6-7ce3b17a1839\" (UID: \"58c509e8-23df-42b6-aad6-7ce3b17a1839\") " Apr 24 17:27:22.716167 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.716174 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58c509e8-23df-42b6-aad6-7ce3b17a1839-kserve-provision-location\") pod \"58c509e8-23df-42b6-aad6-7ce3b17a1839\" (UID: \"58c509e8-23df-42b6-aad6-7ce3b17a1839\") " Apr 24 17:27:22.716449 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.716194 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4cbr\" (UniqueName: \"kubernetes.io/projected/58c509e8-23df-42b6-aad6-7ce3b17a1839-kube-api-access-s4cbr\") pod \"58c509e8-23df-42b6-aad6-7ce3b17a1839\" (UID: \"58c509e8-23df-42b6-aad6-7ce3b17a1839\") " Apr 24 17:27:22.716449 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.716220 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58c509e8-23df-42b6-aad6-7ce3b17a1839-proxy-tls\") pod \"58c509e8-23df-42b6-aad6-7ce3b17a1839\" (UID: \"58c509e8-23df-42b6-aad6-7ce3b17a1839\") " Apr 24 17:27:22.716566 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.716496 2573 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58c509e8-23df-42b6-aad6-7ce3b17a1839-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "58c509e8-23df-42b6-aad6-7ce3b17a1839" (UID: "58c509e8-23df-42b6-aad6-7ce3b17a1839"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:27:22.716566 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.716526 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58c509e8-23df-42b6-aad6-7ce3b17a1839-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config") pod "58c509e8-23df-42b6-aad6-7ce3b17a1839" (UID: "58c509e8-23df-42b6-aad6-7ce3b17a1839"). InnerVolumeSpecName "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:27:22.719098 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.719070 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58c509e8-23df-42b6-aad6-7ce3b17a1839-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "58c509e8-23df-42b6-aad6-7ce3b17a1839" (UID: "58c509e8-23df-42b6-aad6-7ce3b17a1839"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:27:22.719325 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.719285 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58c509e8-23df-42b6-aad6-7ce3b17a1839-kube-api-access-s4cbr" (OuterVolumeSpecName: "kube-api-access-s4cbr") pod "58c509e8-23df-42b6-aad6-7ce3b17a1839" (UID: "58c509e8-23df-42b6-aad6-7ce3b17a1839"). InnerVolumeSpecName "kube-api-access-s4cbr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:27:22.817156 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.817050 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/58c509e8-23df-42b6-aad6-7ce3b17a1839-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:27:22.817156 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.817090 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58c509e8-23df-42b6-aad6-7ce3b17a1839-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:27:22.817156 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.817101 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s4cbr\" (UniqueName: \"kubernetes.io/projected/58c509e8-23df-42b6-aad6-7ce3b17a1839-kube-api-access-s4cbr\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:27:22.817156 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.817110 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58c509e8-23df-42b6-aad6-7ce3b17a1839-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:27:22.937912 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.937872 2573 generic.go:358] "Generic (PLEG): container finished" podID="58c509e8-23df-42b6-aad6-7ce3b17a1839" containerID="269cf09687f09c582b09b724348a870d19a2b42fec702e336e7a954ef74a2b89" exitCode=0 Apr 24 17:27:22.938352 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.937965 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" Apr 24 17:27:22.938352 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.937962 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" event={"ID":"58c509e8-23df-42b6-aad6-7ce3b17a1839","Type":"ContainerDied","Data":"269cf09687f09c582b09b724348a870d19a2b42fec702e336e7a954ef74a2b89"} Apr 24 17:27:22.938352 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.938010 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj" event={"ID":"58c509e8-23df-42b6-aad6-7ce3b17a1839","Type":"ContainerDied","Data":"2d3e76ffc75b2fe4330cde552e2522da4a6a11ffd6d7a0bedc11ace358161eb7"} Apr 24 17:27:22.938352 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.938035 2573 scope.go:117] "RemoveContainer" containerID="40ed356957c174d515c61ca409a009b24077e5bdf029f706ba80caf88d0076e6" Apr 24 17:27:22.947242 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.947223 2573 scope.go:117] "RemoveContainer" containerID="269cf09687f09c582b09b724348a870d19a2b42fec702e336e7a954ef74a2b89" Apr 24 17:27:22.955022 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.954996 2573 scope.go:117] "RemoveContainer" containerID="3c2c92c9f2ab8d06ef6dabbe35ac37658d0a19b41d2e2991a9d52603683dcbd2" Apr 24 17:27:22.959213 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.959184 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj"] Apr 24 17:27:22.962977 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.962957 2573 scope.go:117] "RemoveContainer" containerID="40ed356957c174d515c61ca409a009b24077e5bdf029f706ba80caf88d0076e6" Apr 24 17:27:22.963555 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:27:22.963411 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"40ed356957c174d515c61ca409a009b24077e5bdf029f706ba80caf88d0076e6\": container with ID starting with 40ed356957c174d515c61ca409a009b24077e5bdf029f706ba80caf88d0076e6 not found: ID does not exist" containerID="40ed356957c174d515c61ca409a009b24077e5bdf029f706ba80caf88d0076e6" Apr 24 17:27:22.963555 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.963455 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40ed356957c174d515c61ca409a009b24077e5bdf029f706ba80caf88d0076e6"} err="failed to get container status \"40ed356957c174d515c61ca409a009b24077e5bdf029f706ba80caf88d0076e6\": rpc error: code = NotFound desc = could not find container \"40ed356957c174d515c61ca409a009b24077e5bdf029f706ba80caf88d0076e6\": container with ID starting with 40ed356957c174d515c61ca409a009b24077e5bdf029f706ba80caf88d0076e6 not found: ID does not exist" Apr 24 17:27:22.963555 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.963481 2573 scope.go:117] "RemoveContainer" containerID="269cf09687f09c582b09b724348a870d19a2b42fec702e336e7a954ef74a2b89" Apr 24 17:27:22.963808 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:27:22.963773 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"269cf09687f09c582b09b724348a870d19a2b42fec702e336e7a954ef74a2b89\": container with ID starting with 269cf09687f09c582b09b724348a870d19a2b42fec702e336e7a954ef74a2b89 not found: ID does not exist" containerID="269cf09687f09c582b09b724348a870d19a2b42fec702e336e7a954ef74a2b89" Apr 24 17:27:22.963872 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.963838 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"269cf09687f09c582b09b724348a870d19a2b42fec702e336e7a954ef74a2b89"} err="failed to get container status \"269cf09687f09c582b09b724348a870d19a2b42fec702e336e7a954ef74a2b89\": rpc error: code = NotFound desc 
= could not find container \"269cf09687f09c582b09b724348a870d19a2b42fec702e336e7a954ef74a2b89\": container with ID starting with 269cf09687f09c582b09b724348a870d19a2b42fec702e336e7a954ef74a2b89 not found: ID does not exist" Apr 24 17:27:22.963922 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.963876 2573 scope.go:117] "RemoveContainer" containerID="3c2c92c9f2ab8d06ef6dabbe35ac37658d0a19b41d2e2991a9d52603683dcbd2" Apr 24 17:27:22.964673 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:27:22.964188 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c2c92c9f2ab8d06ef6dabbe35ac37658d0a19b41d2e2991a9d52603683dcbd2\": container with ID starting with 3c2c92c9f2ab8d06ef6dabbe35ac37658d0a19b41d2e2991a9d52603683dcbd2 not found: ID does not exist" containerID="3c2c92c9f2ab8d06ef6dabbe35ac37658d0a19b41d2e2991a9d52603683dcbd2" Apr 24 17:27:22.964673 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.964224 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c2c92c9f2ab8d06ef6dabbe35ac37658d0a19b41d2e2991a9d52603683dcbd2"} err="failed to get container status \"3c2c92c9f2ab8d06ef6dabbe35ac37658d0a19b41d2e2991a9d52603683dcbd2\": rpc error: code = NotFound desc = could not find container \"3c2c92c9f2ab8d06ef6dabbe35ac37658d0a19b41d2e2991a9d52603683dcbd2\": container with ID starting with 3c2c92c9f2ab8d06ef6dabbe35ac37658d0a19b41d2e2991a9d52603683dcbd2 not found: ID does not exist" Apr 24 17:27:22.966177 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:22.966151 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-k4lnj"] Apr 24 17:27:24.216632 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:24.216597 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58c509e8-23df-42b6-aad6-7ce3b17a1839" 
path="/var/lib/kubelet/pods/58c509e8-23df-42b6-aad6-7ce3b17a1839/volumes" Apr 24 17:27:26.939356 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:26.939326 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" Apr 24 17:27:26.939949 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:26.939920 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" podUID="136aba73-6488-478f-96da-bd551f727cef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 17:27:36.939953 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:36.939912 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" podUID="136aba73-6488-478f-96da-bd551f727cef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 17:27:46.940300 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:46.940248 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" podUID="136aba73-6488-478f-96da-bd551f727cef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 17:27:56.939945 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:27:56.939896 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" podUID="136aba73-6488-478f-96da-bd551f727cef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 17:28:06.940110 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:06.940064 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" 
podUID="136aba73-6488-478f-96da-bd551f727cef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 17:28:16.939810 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:16.939767 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" podUID="136aba73-6488-478f-96da-bd551f727cef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused" Apr 24 17:28:26.940798 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:26.940716 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" Apr 24 17:28:34.536814 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.536777 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp"] Apr 24 17:28:34.537251 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.537153 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" podUID="136aba73-6488-478f-96da-bd551f727cef" containerName="kserve-container" containerID="cri-o://39363f920339c2149a48da4d9bc57d2727982db150fe5e16e7e30f10c2dd053d" gracePeriod=30 Apr 24 17:28:34.537251 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.537184 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" podUID="136aba73-6488-478f-96da-bd551f727cef" containerName="kube-rbac-proxy" containerID="cri-o://9ac449a90c07716797dc6cfa2d6605abbe951fd1c985f529fa485606d1916ea6" gracePeriod=30 Apr 24 17:28:34.623667 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.623632 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq"] Apr 24 
17:28:34.623980 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.623966 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58c509e8-23df-42b6-aad6-7ce3b17a1839" containerName="kserve-container" Apr 24 17:28:34.624029 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.623982 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c509e8-23df-42b6-aad6-7ce3b17a1839" containerName="kserve-container" Apr 24 17:28:34.624029 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.623995 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58c509e8-23df-42b6-aad6-7ce3b17a1839" containerName="storage-initializer" Apr 24 17:28:34.624029 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.624001 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c509e8-23df-42b6-aad6-7ce3b17a1839" containerName="storage-initializer" Apr 24 17:28:34.624029 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.624010 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58c509e8-23df-42b6-aad6-7ce3b17a1839" containerName="kube-rbac-proxy" Apr 24 17:28:34.624029 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.624015 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c509e8-23df-42b6-aad6-7ce3b17a1839" containerName="kube-rbac-proxy" Apr 24 17:28:34.624188 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.624073 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="58c509e8-23df-42b6-aad6-7ce3b17a1839" containerName="kserve-container" Apr 24 17:28:34.624188 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.624084 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="58c509e8-23df-42b6-aad6-7ce3b17a1839" containerName="kube-rbac-proxy" Apr 24 17:28:34.627440 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.627416 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" Apr 24 17:28:34.629656 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.629627 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-predictor-serving-cert\"" Apr 24 17:28:34.629656 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.629648 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\"" Apr 24 17:28:34.638712 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.638682 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq"] Apr 24 17:28:34.732936 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.732885 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed708dff-0422-4007-ba80-c98d484dd7c8-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq\" (UID: \"ed708dff-0422-4007-ba80-c98d484dd7c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" Apr 24 17:28:34.733176 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.732972 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed708dff-0422-4007-ba80-c98d484dd7c8-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq\" (UID: \"ed708dff-0422-4007-ba80-c98d484dd7c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" Apr 24 17:28:34.733176 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.733025 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nfvg\" (UniqueName: 
\"kubernetes.io/projected/ed708dff-0422-4007-ba80-c98d484dd7c8-kube-api-access-9nfvg\") pod \"isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq\" (UID: \"ed708dff-0422-4007-ba80-c98d484dd7c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" Apr 24 17:28:34.733176 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.733109 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ed708dff-0422-4007-ba80-c98d484dd7c8-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq\" (UID: \"ed708dff-0422-4007-ba80-c98d484dd7c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" Apr 24 17:28:34.833902 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.833801 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed708dff-0422-4007-ba80-c98d484dd7c8-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq\" (UID: \"ed708dff-0422-4007-ba80-c98d484dd7c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" Apr 24 17:28:34.833902 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.833853 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9nfvg\" (UniqueName: \"kubernetes.io/projected/ed708dff-0422-4007-ba80-c98d484dd7c8-kube-api-access-9nfvg\") pod \"isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq\" (UID: \"ed708dff-0422-4007-ba80-c98d484dd7c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" Apr 24 17:28:34.833902 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.833882 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/ed708dff-0422-4007-ba80-c98d484dd7c8-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq\" (UID: \"ed708dff-0422-4007-ba80-c98d484dd7c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" Apr 24 17:28:34.834232 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.833934 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed708dff-0422-4007-ba80-c98d484dd7c8-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq\" (UID: \"ed708dff-0422-4007-ba80-c98d484dd7c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" Apr 24 17:28:34.834366 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.834339 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed708dff-0422-4007-ba80-c98d484dd7c8-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq\" (UID: \"ed708dff-0422-4007-ba80-c98d484dd7c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" Apr 24 17:28:34.834651 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.834627 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ed708dff-0422-4007-ba80-c98d484dd7c8-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq\" (UID: \"ed708dff-0422-4007-ba80-c98d484dd7c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" Apr 24 17:28:34.836621 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.836604 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed708dff-0422-4007-ba80-c98d484dd7c8-proxy-tls\") pod 
\"isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq\" (UID: \"ed708dff-0422-4007-ba80-c98d484dd7c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" Apr 24 17:28:34.842490 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.842464 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nfvg\" (UniqueName: \"kubernetes.io/projected/ed708dff-0422-4007-ba80-c98d484dd7c8-kube-api-access-9nfvg\") pod \"isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq\" (UID: \"ed708dff-0422-4007-ba80-c98d484dd7c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" Apr 24 17:28:34.939537 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:34.939504 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" Apr 24 17:28:35.070654 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:35.070618 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq"] Apr 24 17:28:35.074540 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:28:35.074510 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded708dff_0422_4007_ba80_c98d484dd7c8.slice/crio-fe007d34577895279fd3b99112a5b5f606ba41ef435f555e13abf3203ec3814f WatchSource:0}: Error finding container fe007d34577895279fd3b99112a5b5f606ba41ef435f555e13abf3203ec3814f: Status 404 returned error can't find the container with id fe007d34577895279fd3b99112a5b5f606ba41ef435f555e13abf3203ec3814f Apr 24 17:28:35.076502 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:35.076485 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 17:28:35.153915 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:35.153875 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" event={"ID":"ed708dff-0422-4007-ba80-c98d484dd7c8","Type":"ContainerStarted","Data":"12d776f50614017202cd045b198a13de24dee71a2bfe6c0f44f265d602d66c85"} Apr 24 17:28:35.153915 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:35.153922 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" event={"ID":"ed708dff-0422-4007-ba80-c98d484dd7c8","Type":"ContainerStarted","Data":"fe007d34577895279fd3b99112a5b5f606ba41ef435f555e13abf3203ec3814f"} Apr 24 17:28:35.155940 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:35.155908 2573 generic.go:358] "Generic (PLEG): container finished" podID="136aba73-6488-478f-96da-bd551f727cef" containerID="9ac449a90c07716797dc6cfa2d6605abbe951fd1c985f529fa485606d1916ea6" exitCode=2 Apr 24 17:28:35.156102 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:35.155967 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" event={"ID":"136aba73-6488-478f-96da-bd551f727cef","Type":"ContainerDied","Data":"9ac449a90c07716797dc6cfa2d6605abbe951fd1c985f529fa485606d1916ea6"} Apr 24 17:28:36.934118 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:36.934072 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" podUID="136aba73-6488-478f-96da-bd551f727cef" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.50:8643/healthz\": dial tcp 10.134.0.50:8643: connect: connection refused" Apr 24 17:28:36.939946 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:36.939914 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" podUID="136aba73-6488-478f-96da-bd551f727cef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection 
refused" Apr 24 17:28:39.168685 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:39.168644 2573 generic.go:358] "Generic (PLEG): container finished" podID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerID="12d776f50614017202cd045b198a13de24dee71a2bfe6c0f44f265d602d66c85" exitCode=0 Apr 24 17:28:39.169122 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:39.168694 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" event={"ID":"ed708dff-0422-4007-ba80-c98d484dd7c8","Type":"ContainerDied","Data":"12d776f50614017202cd045b198a13de24dee71a2bfe6c0f44f265d602d66c85"} Apr 24 17:28:39.390336 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:39.390291 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" Apr 24 17:28:39.472617 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:39.472569 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/136aba73-6488-478f-96da-bd551f727cef-proxy-tls\") pod \"136aba73-6488-478f-96da-bd551f727cef\" (UID: \"136aba73-6488-478f-96da-bd551f727cef\") " Apr 24 17:28:39.472838 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:39.472633 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q2gv\" (UniqueName: \"kubernetes.io/projected/136aba73-6488-478f-96da-bd551f727cef-kube-api-access-9q2gv\") pod \"136aba73-6488-478f-96da-bd551f727cef\" (UID: \"136aba73-6488-478f-96da-bd551f727cef\") " Apr 24 17:28:39.472838 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:39.472660 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/136aba73-6488-478f-96da-bd551f727cef-kserve-provision-location\") pod \"136aba73-6488-478f-96da-bd551f727cef\" (UID: 
\"136aba73-6488-478f-96da-bd551f727cef\") " Apr 24 17:28:39.472838 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:39.472696 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/136aba73-6488-478f-96da-bd551f727cef-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"136aba73-6488-478f-96da-bd551f727cef\" (UID: \"136aba73-6488-478f-96da-bd551f727cef\") " Apr 24 17:28:39.473041 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:39.473017 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/136aba73-6488-478f-96da-bd551f727cef-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "136aba73-6488-478f-96da-bd551f727cef" (UID: "136aba73-6488-478f-96da-bd551f727cef"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:28:39.473103 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:39.473031 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/136aba73-6488-478f-96da-bd551f727cef-isvc-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-kube-rbac-proxy-sar-config") pod "136aba73-6488-478f-96da-bd551f727cef" (UID: "136aba73-6488-478f-96da-bd551f727cef"). InnerVolumeSpecName "isvc-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:28:39.474840 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:39.474813 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136aba73-6488-478f-96da-bd551f727cef-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "136aba73-6488-478f-96da-bd551f727cef" (UID: "136aba73-6488-478f-96da-bd551f727cef"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:28:39.474961 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:39.474939 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/136aba73-6488-478f-96da-bd551f727cef-kube-api-access-9q2gv" (OuterVolumeSpecName: "kube-api-access-9q2gv") pod "136aba73-6488-478f-96da-bd551f727cef" (UID: "136aba73-6488-478f-96da-bd551f727cef"). InnerVolumeSpecName "kube-api-access-9q2gv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:28:39.573885 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:39.573768 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/136aba73-6488-478f-96da-bd551f727cef-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:28:39.573885 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:39.573816 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9q2gv\" (UniqueName: \"kubernetes.io/projected/136aba73-6488-478f-96da-bd551f727cef-kube-api-access-9q2gv\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:28:39.573885 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:39.573830 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/136aba73-6488-478f-96da-bd551f727cef-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:28:39.573885 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:39.573843 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/136aba73-6488-478f-96da-bd551f727cef-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:28:40.173896 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:40.173858 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" event={"ID":"ed708dff-0422-4007-ba80-c98d484dd7c8","Type":"ContainerStarted","Data":"9caf9555d0836da43e9cc6153987e1ef62993393f4c6fe6daea617571c327d10"}
Apr 24 17:28:40.174383 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:40.173905 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" event={"ID":"ed708dff-0422-4007-ba80-c98d484dd7c8","Type":"ContainerStarted","Data":"ed90d167e908c603ffc0efbef3b9ec357a5c8810b38007cf6d263823b7db4edb"}
Apr 24 17:28:40.174383 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:40.174215 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq"
Apr 24 17:28:40.174383 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:40.174335 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq"
Apr 24 17:28:40.175733 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:40.175705 2573 generic.go:358] "Generic (PLEG): container finished" podID="136aba73-6488-478f-96da-bd551f727cef" containerID="39363f920339c2149a48da4d9bc57d2727982db150fe5e16e7e30f10c2dd053d" exitCode=0
Apr 24 17:28:40.175846 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:40.175737 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" event={"ID":"136aba73-6488-478f-96da-bd551f727cef","Type":"ContainerDied","Data":"39363f920339c2149a48da4d9bc57d2727982db150fe5e16e7e30f10c2dd053d"}
Apr 24 17:28:40.175846 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:40.175741 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" podUID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused"
Apr 24 17:28:40.175846 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:40.175780 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp" event={"ID":"136aba73-6488-478f-96da-bd551f727cef","Type":"ContainerDied","Data":"b2c0cb44733b00132467028586851e69d023305f85dc9e8b5fadddffa566afc7"}
Apr 24 17:28:40.175846 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:40.175785 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp"
Apr 24 17:28:40.175846 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:40.175799 2573 scope.go:117] "RemoveContainer" containerID="9ac449a90c07716797dc6cfa2d6605abbe951fd1c985f529fa485606d1916ea6"
Apr 24 17:28:40.183943 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:40.183850 2573 scope.go:117] "RemoveContainer" containerID="39363f920339c2149a48da4d9bc57d2727982db150fe5e16e7e30f10c2dd053d"
Apr 24 17:28:40.191748 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:40.191666 2573 scope.go:117] "RemoveContainer" containerID="ca8cb04ef049c11a1ec531a315ca88139078d0832e52880a8d91f6ebd0c4ab77"
Apr 24 17:28:40.194187 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:40.194130 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" podStartSLOduration=6.194114459 podStartE2EDuration="6.194114459s" podCreationTimestamp="2026-04-24 17:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:28:40.192093509 +0000 UTC m=+2978.456808908" watchObservedRunningTime="2026-04-24 17:28:40.194114459 +0000 UTC m=+2978.458829856"
Apr 24 17:28:40.199990 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:40.199965 2573 scope.go:117] "RemoveContainer" containerID="9ac449a90c07716797dc6cfa2d6605abbe951fd1c985f529fa485606d1916ea6"
Apr 24 17:28:40.200299 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:28:40.200280 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ac449a90c07716797dc6cfa2d6605abbe951fd1c985f529fa485606d1916ea6\": container with ID starting with 9ac449a90c07716797dc6cfa2d6605abbe951fd1c985f529fa485606d1916ea6 not found: ID does not exist" containerID="9ac449a90c07716797dc6cfa2d6605abbe951fd1c985f529fa485606d1916ea6"
Apr 24 17:28:40.200434 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:40.200326 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ac449a90c07716797dc6cfa2d6605abbe951fd1c985f529fa485606d1916ea6"} err="failed to get container status \"9ac449a90c07716797dc6cfa2d6605abbe951fd1c985f529fa485606d1916ea6\": rpc error: code = NotFound desc = could not find container \"9ac449a90c07716797dc6cfa2d6605abbe951fd1c985f529fa485606d1916ea6\": container with ID starting with 9ac449a90c07716797dc6cfa2d6605abbe951fd1c985f529fa485606d1916ea6 not found: ID does not exist"
Apr 24 17:28:40.200434 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:40.200348 2573 scope.go:117] "RemoveContainer" containerID="39363f920339c2149a48da4d9bc57d2727982db150fe5e16e7e30f10c2dd053d"
Apr 24 17:28:40.200651 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:28:40.200632 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39363f920339c2149a48da4d9bc57d2727982db150fe5e16e7e30f10c2dd053d\": container with ID starting with 39363f920339c2149a48da4d9bc57d2727982db150fe5e16e7e30f10c2dd053d not found: ID does not exist" containerID="39363f920339c2149a48da4d9bc57d2727982db150fe5e16e7e30f10c2dd053d"
Apr 24 17:28:40.200703 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:40.200658 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39363f920339c2149a48da4d9bc57d2727982db150fe5e16e7e30f10c2dd053d"} err="failed to get container status \"39363f920339c2149a48da4d9bc57d2727982db150fe5e16e7e30f10c2dd053d\": rpc error: code = NotFound desc = could not find container \"39363f920339c2149a48da4d9bc57d2727982db150fe5e16e7e30f10c2dd053d\": container with ID starting with 39363f920339c2149a48da4d9bc57d2727982db150fe5e16e7e30f10c2dd053d not found: ID does not exist"
Apr 24 17:28:40.200703 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:40.200677 2573 scope.go:117] "RemoveContainer" containerID="ca8cb04ef049c11a1ec531a315ca88139078d0832e52880a8d91f6ebd0c4ab77"
Apr 24 17:28:40.200928 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:28:40.200912 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca8cb04ef049c11a1ec531a315ca88139078d0832e52880a8d91f6ebd0c4ab77\": container with ID starting with ca8cb04ef049c11a1ec531a315ca88139078d0832e52880a8d91f6ebd0c4ab77 not found: ID does not exist" containerID="ca8cb04ef049c11a1ec531a315ca88139078d0832e52880a8d91f6ebd0c4ab77"
Apr 24 17:28:40.200970 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:40.200932 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca8cb04ef049c11a1ec531a315ca88139078d0832e52880a8d91f6ebd0c4ab77"} err="failed to get container status \"ca8cb04ef049c11a1ec531a315ca88139078d0832e52880a8d91f6ebd0c4ab77\": rpc error: code = NotFound desc = could not find container \"ca8cb04ef049c11a1ec531a315ca88139078d0832e52880a8d91f6ebd0c4ab77\": container with ID starting with ca8cb04ef049c11a1ec531a315ca88139078d0832e52880a8d91f6ebd0c4ab77 not found: ID does not exist"
Apr 24 17:28:40.204696 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:40.204671 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp"]
Apr 24 17:28:40.208751 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:40.208721 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-6779b78747-8rqlp"]
Apr 24 17:28:40.217104 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:40.217074 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="136aba73-6488-478f-96da-bd551f727cef" path="/var/lib/kubelet/pods/136aba73-6488-478f-96da-bd551f727cef/volumes"
Apr 24 17:28:41.180719 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:41.180683 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" podUID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused"
Apr 24 17:28:46.185206 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:46.185172 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq"
Apr 24 17:28:46.185733 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:46.185704 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" podUID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused"
Apr 24 17:28:56.186029 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:28:56.185983 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" podUID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused"
Apr 24 17:29:02.316065 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:29:02.316028 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log"
Apr 24 17:29:02.319458 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:29:02.319431 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log"
Apr 24 17:29:06.186065 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:29:06.186021 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" podUID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused"
Apr 24 17:29:16.186469 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:29:16.186420 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" podUID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused"
Apr 24 17:29:26.186141 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:29:26.186097 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" podUID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused"
Apr 24 17:29:36.185775 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:29:36.185728 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" podUID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused"
Apr 24 17:29:46.186477 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:29:46.186429 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" podUID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused"
Apr 24 17:29:56.186541 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:29:56.186450 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq"
Apr 24 17:30:04.770353 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:04.770293 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq"]
Apr 24 17:30:04.771246 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:04.771186 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" podUID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerName="kserve-container" containerID="cri-o://ed90d167e908c603ffc0efbef3b9ec357a5c8810b38007cf6d263823b7db4edb" gracePeriod=30
Apr 24 17:30:04.771480 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:04.771236 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" podUID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerName="kube-rbac-proxy" containerID="cri-o://9caf9555d0836da43e9cc6153987e1ef62993393f4c6fe6daea617571c327d10" gracePeriod=30
Apr 24 17:30:04.901684 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:04.901644 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl"]
Apr 24 17:30:04.901972 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:04.901958 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="136aba73-6488-478f-96da-bd551f727cef" containerName="kube-rbac-proxy"
Apr 24 17:30:04.902017 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:04.901977 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="136aba73-6488-478f-96da-bd551f727cef" containerName="kube-rbac-proxy"
Apr 24 17:30:04.902017 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:04.901988 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="136aba73-6488-478f-96da-bd551f727cef" containerName="kserve-container"
Apr 24 17:30:04.902017 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:04.901994 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="136aba73-6488-478f-96da-bd551f727cef" containerName="kserve-container"
Apr 24 17:30:04.902017 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:04.902007 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="136aba73-6488-478f-96da-bd551f727cef" containerName="storage-initializer"
Apr 24 17:30:04.902017 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:04.902012 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="136aba73-6488-478f-96da-bd551f727cef" containerName="storage-initializer"
Apr 24 17:30:04.902185 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:04.902064 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="136aba73-6488-478f-96da-bd551f727cef" containerName="kube-rbac-proxy"
Apr 24 17:30:04.902185 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:04.902075 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="136aba73-6488-478f-96da-bd551f727cef" containerName="kserve-container"
Apr 24 17:30:04.905124 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:04.905101 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl"
Apr 24 17:30:04.907360 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:04.907335 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-predictor-serving-cert\""
Apr 24 17:30:04.907490 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:04.907335 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-kube-rbac-proxy-sar-config\""
Apr 24 17:30:04.915346 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:04.915298 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl"]
Apr 24 17:30:04.999290 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:04.999249 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-2pggl\" (UID: \"9985b752-3f9c-4d46-a8a0-7c63991c6ed5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl"
Apr 24 17:30:04.999290 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:04.999303 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-2pggl\" (UID: \"9985b752-3f9c-4d46-a8a0-7c63991c6ed5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl"
Apr 24 17:30:04.999558 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:04.999348 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xldz7\" (UniqueName: \"kubernetes.io/projected/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-kube-api-access-xldz7\") pod \"isvc-tensorflow-predictor-6756f669d7-2pggl\" (UID: \"9985b752-3f9c-4d46-a8a0-7c63991c6ed5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl"
Apr 24 17:30:04.999558 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:04.999436 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-2pggl\" (UID: \"9985b752-3f9c-4d46-a8a0-7c63991c6ed5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl"
Apr 24 17:30:05.100369 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:05.100235 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-2pggl\" (UID: \"9985b752-3f9c-4d46-a8a0-7c63991c6ed5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl"
Apr 24 17:30:05.100369 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:05.100346 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-2pggl\" (UID: \"9985b752-3f9c-4d46-a8a0-7c63991c6ed5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl"
Apr 24 17:30:05.100617 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:05.100379 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xldz7\" (UniqueName: \"kubernetes.io/projected/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-kube-api-access-xldz7\") pod \"isvc-tensorflow-predictor-6756f669d7-2pggl\" (UID: \"9985b752-3f9c-4d46-a8a0-7c63991c6ed5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl"
Apr 24 17:30:05.100617 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:05.100441 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-2pggl\" (UID: \"9985b752-3f9c-4d46-a8a0-7c63991c6ed5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl"
Apr 24 17:30:05.100617 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:30:05.100585 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-tensorflow-predictor-serving-cert: secret "isvc-tensorflow-predictor-serving-cert" not found
Apr 24 17:30:05.100778 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:30:05.100667 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-proxy-tls podName:9985b752-3f9c-4d46-a8a0-7c63991c6ed5 nodeName:}" failed. No retries permitted until 2026-04-24 17:30:05.600642822 +0000 UTC m=+3063.865358203 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-proxy-tls") pod "isvc-tensorflow-predictor-6756f669d7-2pggl" (UID: "9985b752-3f9c-4d46-a8a0-7c63991c6ed5") : secret "isvc-tensorflow-predictor-serving-cert" not found
Apr 24 17:30:05.100778 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:05.100712 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-2pggl\" (UID: \"9985b752-3f9c-4d46-a8a0-7c63991c6ed5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl"
Apr 24 17:30:05.100997 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:05.100977 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-2pggl\" (UID: \"9985b752-3f9c-4d46-a8a0-7c63991c6ed5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl"
Apr 24 17:30:05.110824 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:05.110794 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xldz7\" (UniqueName: \"kubernetes.io/projected/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-kube-api-access-xldz7\") pod \"isvc-tensorflow-predictor-6756f669d7-2pggl\" (UID: \"9985b752-3f9c-4d46-a8a0-7c63991c6ed5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl"
Apr 24 17:30:05.426259 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:05.426155 2573 generic.go:358] "Generic (PLEG): container finished" podID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerID="9caf9555d0836da43e9cc6153987e1ef62993393f4c6fe6daea617571c327d10" exitCode=2
Apr 24 17:30:05.426259 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:05.426242 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" event={"ID":"ed708dff-0422-4007-ba80-c98d484dd7c8","Type":"ContainerDied","Data":"9caf9555d0836da43e9cc6153987e1ef62993393f4c6fe6daea617571c327d10"}
Apr 24 17:30:05.603712 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:05.603675 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-2pggl\" (UID: \"9985b752-3f9c-4d46-a8a0-7c63991c6ed5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl"
Apr 24 17:30:05.606487 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:05.606454 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-2pggl\" (UID: \"9985b752-3f9c-4d46-a8a0-7c63991c6ed5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl"
Apr 24 17:30:05.816163 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:05.816121 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl"
Apr 24 17:30:05.945514 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:05.945437 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl"]
Apr 24 17:30:05.949374 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:30:05.949340 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9985b752_3f9c_4d46_a8a0_7c63991c6ed5.slice/crio-52cc6aa59c1f87aa4c8124841301b2823e44dbe1480a3755c371b614931ffa2f WatchSource:0}: Error finding container 52cc6aa59c1f87aa4c8124841301b2823e44dbe1480a3755c371b614931ffa2f: Status 404 returned error can't find the container with id 52cc6aa59c1f87aa4c8124841301b2823e44dbe1480a3755c371b614931ffa2f
Apr 24 17:30:06.181462 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:06.181352 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" podUID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.51:8643/healthz\": dial tcp 10.134.0.51:8643: connect: connection refused"
Apr 24 17:30:06.186158 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:06.186122 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" podUID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused"
Apr 24 17:30:06.430385 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:06.430342 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" event={"ID":"9985b752-3f9c-4d46-a8a0-7c63991c6ed5","Type":"ContainerStarted","Data":"4a06bdfa6d699076b831539a93babf5c96a6dc2a94ae1e06e3a0a470efdd28c3"}
Apr 24 17:30:06.430385 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:06.430385 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" event={"ID":"9985b752-3f9c-4d46-a8a0-7c63991c6ed5","Type":"ContainerStarted","Data":"52cc6aa59c1f87aa4c8124841301b2823e44dbe1480a3755c371b614931ffa2f"}
Apr 24 17:30:09.716446 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:09.716418 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq"
Apr 24 17:30:09.739465 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:09.739433 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nfvg\" (UniqueName: \"kubernetes.io/projected/ed708dff-0422-4007-ba80-c98d484dd7c8-kube-api-access-9nfvg\") pod \"ed708dff-0422-4007-ba80-c98d484dd7c8\" (UID: \"ed708dff-0422-4007-ba80-c98d484dd7c8\") "
Apr 24 17:30:09.739717 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:09.739518 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ed708dff-0422-4007-ba80-c98d484dd7c8-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"ed708dff-0422-4007-ba80-c98d484dd7c8\" (UID: \"ed708dff-0422-4007-ba80-c98d484dd7c8\") "
Apr 24 17:30:09.739717 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:09.739583 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed708dff-0422-4007-ba80-c98d484dd7c8-proxy-tls\") pod \"ed708dff-0422-4007-ba80-c98d484dd7c8\" (UID: \"ed708dff-0422-4007-ba80-c98d484dd7c8\") "
Apr 24 17:30:09.739846 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:09.739722 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed708dff-0422-4007-ba80-c98d484dd7c8-kserve-provision-location\") pod \"ed708dff-0422-4007-ba80-c98d484dd7c8\" (UID: \"ed708dff-0422-4007-ba80-c98d484dd7c8\") "
Apr 24 17:30:09.739943 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:09.739907 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed708dff-0422-4007-ba80-c98d484dd7c8-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config") pod "ed708dff-0422-4007-ba80-c98d484dd7c8" (UID: "ed708dff-0422-4007-ba80-c98d484dd7c8"). InnerVolumeSpecName "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 17:30:09.740074 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:09.740055 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed708dff-0422-4007-ba80-c98d484dd7c8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ed708dff-0422-4007-ba80-c98d484dd7c8" (UID: "ed708dff-0422-4007-ba80-c98d484dd7c8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 17:30:09.742140 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:09.742111 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed708dff-0422-4007-ba80-c98d484dd7c8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ed708dff-0422-4007-ba80-c98d484dd7c8" (UID: "ed708dff-0422-4007-ba80-c98d484dd7c8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 17:30:09.742289 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:09.742153 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed708dff-0422-4007-ba80-c98d484dd7c8-kube-api-access-9nfvg" (OuterVolumeSpecName: "kube-api-access-9nfvg") pod "ed708dff-0422-4007-ba80-c98d484dd7c8" (UID: "ed708dff-0422-4007-ba80-c98d484dd7c8"). InnerVolumeSpecName "kube-api-access-9nfvg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 17:30:09.841058 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:09.840966 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed708dff-0422-4007-ba80-c98d484dd7c8-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 17:30:09.841058 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:09.841002 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9nfvg\" (UniqueName: \"kubernetes.io/projected/ed708dff-0422-4007-ba80-c98d484dd7c8-kube-api-access-9nfvg\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 17:30:09.841058 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:09.841014 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ed708dff-0422-4007-ba80-c98d484dd7c8-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 17:30:09.841058 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:09.841026 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed708dff-0422-4007-ba80-c98d484dd7c8-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 17:30:10.444004 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:10.443955 2573 generic.go:358] "Generic (PLEG): container finished" podID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerID="ed90d167e908c603ffc0efbef3b9ec357a5c8810b38007cf6d263823b7db4edb" exitCode=0
Apr 24 17:30:10.444226 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:10.444033 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" event={"ID":"ed708dff-0422-4007-ba80-c98d484dd7c8","Type":"ContainerDied","Data":"ed90d167e908c603ffc0efbef3b9ec357a5c8810b38007cf6d263823b7db4edb"}
Apr 24 17:30:10.444226 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:10.444079 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq" event={"ID":"ed708dff-0422-4007-ba80-c98d484dd7c8","Type":"ContainerDied","Data":"fe007d34577895279fd3b99112a5b5f606ba41ef435f555e13abf3203ec3814f"}
Apr 24 17:30:10.444226 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:10.444095 2573 scope.go:117] "RemoveContainer" containerID="9caf9555d0836da43e9cc6153987e1ef62993393f4c6fe6daea617571c327d10"
Apr 24 17:30:10.444226 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:10.444045 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq"
Apr 24 17:30:10.452430 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:10.452409 2573 scope.go:117] "RemoveContainer" containerID="ed90d167e908c603ffc0efbef3b9ec357a5c8810b38007cf6d263823b7db4edb"
Apr 24 17:30:10.460476 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:10.460454 2573 scope.go:117] "RemoveContainer" containerID="12d776f50614017202cd045b198a13de24dee71a2bfe6c0f44f265d602d66c85"
Apr 24 17:30:10.461018 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:10.460990 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq"]
Apr 24 17:30:10.466330 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:10.466286 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-59966d68bb-qb7sq"]
Apr 24 17:30:10.468992 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:10.468967 2573 scope.go:117] "RemoveContainer" containerID="9caf9555d0836da43e9cc6153987e1ef62993393f4c6fe6daea617571c327d10"
Apr 24 17:30:10.469299 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:30:10.469279 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9caf9555d0836da43e9cc6153987e1ef62993393f4c6fe6daea617571c327d10\": container with ID starting with 9caf9555d0836da43e9cc6153987e1ef62993393f4c6fe6daea617571c327d10 not found: ID does not exist" containerID="9caf9555d0836da43e9cc6153987e1ef62993393f4c6fe6daea617571c327d10"
Apr 24 17:30:10.469413 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:10.469331 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9caf9555d0836da43e9cc6153987e1ef62993393f4c6fe6daea617571c327d10"} err="failed to get container status \"9caf9555d0836da43e9cc6153987e1ef62993393f4c6fe6daea617571c327d10\": rpc error: code = NotFound desc = could not find container \"9caf9555d0836da43e9cc6153987e1ef62993393f4c6fe6daea617571c327d10\": container with ID starting with 9caf9555d0836da43e9cc6153987e1ef62993393f4c6fe6daea617571c327d10 not found: ID does not exist"
Apr 24 17:30:10.469413 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:10.469362 2573 scope.go:117] "RemoveContainer" containerID="ed90d167e908c603ffc0efbef3b9ec357a5c8810b38007cf6d263823b7db4edb"
Apr 24 17:30:10.469671 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:30:10.469650 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed90d167e908c603ffc0efbef3b9ec357a5c8810b38007cf6d263823b7db4edb\": container with ID starting with ed90d167e908c603ffc0efbef3b9ec357a5c8810b38007cf6d263823b7db4edb not found: ID does not exist" containerID="ed90d167e908c603ffc0efbef3b9ec357a5c8810b38007cf6d263823b7db4edb"
Apr 24 17:30:10.469717 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:10.469677 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed90d167e908c603ffc0efbef3b9ec357a5c8810b38007cf6d263823b7db4edb"} err="failed to get container status \"ed90d167e908c603ffc0efbef3b9ec357a5c8810b38007cf6d263823b7db4edb\": rpc error: code = NotFound desc = could not find container \"ed90d167e908c603ffc0efbef3b9ec357a5c8810b38007cf6d263823b7db4edb\": container with ID starting with ed90d167e908c603ffc0efbef3b9ec357a5c8810b38007cf6d263823b7db4edb not found: ID does not exist"
Apr 24 17:30:10.469717 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:10.469695 2573 scope.go:117] "RemoveContainer" containerID="12d776f50614017202cd045b198a13de24dee71a2bfe6c0f44f265d602d66c85"
Apr 24 17:30:10.469926 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:30:10.469907 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12d776f50614017202cd045b198a13de24dee71a2bfe6c0f44f265d602d66c85\": container with ID starting with 12d776f50614017202cd045b198a13de24dee71a2bfe6c0f44f265d602d66c85 not found: ID does not exist" containerID="12d776f50614017202cd045b198a13de24dee71a2bfe6c0f44f265d602d66c85"
Apr 24 17:30:10.469983 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:10.469936 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12d776f50614017202cd045b198a13de24dee71a2bfe6c0f44f265d602d66c85"} err="failed to get container status \"12d776f50614017202cd045b198a13de24dee71a2bfe6c0f44f265d602d66c85\": rpc error: code = NotFound desc = could not find container \"12d776f50614017202cd045b198a13de24dee71a2bfe6c0f44f265d602d66c85\": container with ID starting with 12d776f50614017202cd045b198a13de24dee71a2bfe6c0f44f265d602d66c85 not found: ID does not exist"
Apr 24 17:30:12.216105 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:12.216021 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed708dff-0422-4007-ba80-c98d484dd7c8" path="/var/lib/kubelet/pods/ed708dff-0422-4007-ba80-c98d484dd7c8/volumes"
Apr 24 17:30:12.452223 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:12.452189 2573 generic.go:358] "Generic (PLEG): container finished" podID="9985b752-3f9c-4d46-a8a0-7c63991c6ed5" containerID="4a06bdfa6d699076b831539a93babf5c96a6dc2a94ae1e06e3a0a470efdd28c3" exitCode=0
Apr 24 17:30:12.452417 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:12.452269 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" event={"ID":"9985b752-3f9c-4d46-a8a0-7c63991c6ed5","Type":"ContainerDied","Data":"4a06bdfa6d699076b831539a93babf5c96a6dc2a94ae1e06e3a0a470efdd28c3"}
Apr 24 17:30:17.476591 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:17.476542 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" event={"ID":"9985b752-3f9c-4d46-a8a0-7c63991c6ed5","Type":"ContainerStarted","Data":"b479c10c35bd56b9badedd4e361112115d246b5f16ea6489015cd4d183c24ec3"}
Apr 24 17:30:17.476591 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:17.476598 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" event={"ID":"9985b752-3f9c-4d46-a8a0-7c63991c6ed5","Type":"ContainerStarted","Data":"4add17fd0f452e36ca4dce622c597d57aea1b943f079b05be2f339fd8dd6da1c"}
Apr 24 17:30:17.477013 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:17.476813 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl"
Apr 24 17:30:17.495524 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:17.495459 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" podStartSLOduration=9.434280817 podStartE2EDuration="13.495441088s" podCreationTimestamp="2026-04-24 17:30:04 +0000 UTC" firstStartedPulling="2026-04-24 17:30:12.45345515 +0000 UTC m=+3070.718170529" lastFinishedPulling="2026-04-24 17:30:16.514615406 +0000 UTC m=+3074.779330800" observedRunningTime="2026-04-24 17:30:17.493966045 +0000 UTC m=+3075.758681444" watchObservedRunningTime="2026-04-24 17:30:17.495441088 +0000 UTC m=+3075.760156486"
Apr 24 17:30:18.479077 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:18.479051 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl"
Apr 24 17:30:18.480571 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:18.480536 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" podUID="9985b752-3f9c-4d46-a8a0-7c63991c6ed5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection
refused" Apr 24 17:30:19.482005 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:19.481963 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" podUID="9985b752-3f9c-4d46-a8a0-7c63991c6ed5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 24 17:30:24.486143 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:24.486115 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" Apr 24 17:30:24.486786 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:24.486753 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" podUID="9985b752-3f9c-4d46-a8a0-7c63991c6ed5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 24 17:30:34.487612 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:34.487581 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" Apr 24 17:30:46.966114 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:46.966078 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl"] Apr 24 17:30:46.966777 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:46.966462 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" podUID="9985b752-3f9c-4d46-a8a0-7c63991c6ed5" containerName="kserve-container" containerID="cri-o://4add17fd0f452e36ca4dce622c597d57aea1b943f079b05be2f339fd8dd6da1c" gracePeriod=30 Apr 24 17:30:46.966777 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:46.966531 2573 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" podUID="9985b752-3f9c-4d46-a8a0-7c63991c6ed5" containerName="kube-rbac-proxy" containerID="cri-o://b479c10c35bd56b9badedd4e361112115d246b5f16ea6489015cd4d183c24ec3" gracePeriod=30 Apr 24 17:30:47.098204 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.098158 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t"] Apr 24 17:30:47.098506 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.098493 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerName="kube-rbac-proxy" Apr 24 17:30:47.098562 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.098509 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerName="kube-rbac-proxy" Apr 24 17:30:47.098562 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.098522 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerName="storage-initializer" Apr 24 17:30:47.098562 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.098528 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerName="storage-initializer" Apr 24 17:30:47.098562 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.098543 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerName="kserve-container" Apr 24 17:30:47.098562 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.098549 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerName="kserve-container" Apr 24 17:30:47.098719 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.098600 2573 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerName="kube-rbac-proxy" Apr 24 17:30:47.098719 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.098610 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed708dff-0422-4007-ba80-c98d484dd7c8" containerName="kserve-container" Apr 24 17:30:47.101681 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.101659 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" Apr 24 17:30:47.104086 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.104050 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\"" Apr 24 17:30:47.104086 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.104077 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-predictor-serving-cert\"" Apr 24 17:30:47.118586 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.118555 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t"] Apr 24 17:30:47.266003 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.265962 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb31c5bc-157d-4a7e-b27d-05554483dfa9-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t\" (UID: \"cb31c5bc-157d-4a7e-b27d-05554483dfa9\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" Apr 24 17:30:47.266225 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.266040 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvgb7\" (UniqueName: 
\"kubernetes.io/projected/cb31c5bc-157d-4a7e-b27d-05554483dfa9-kube-api-access-dvgb7\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t\" (UID: \"cb31c5bc-157d-4a7e-b27d-05554483dfa9\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" Apr 24 17:30:47.266225 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.266070 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb31c5bc-157d-4a7e-b27d-05554483dfa9-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t\" (UID: \"cb31c5bc-157d-4a7e-b27d-05554483dfa9\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" Apr 24 17:30:47.266225 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.266091 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb31c5bc-157d-4a7e-b27d-05554483dfa9-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t\" (UID: \"cb31c5bc-157d-4a7e-b27d-05554483dfa9\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" Apr 24 17:30:47.367638 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.367591 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb31c5bc-157d-4a7e-b27d-05554483dfa9-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t\" (UID: \"cb31c5bc-157d-4a7e-b27d-05554483dfa9\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" Apr 24 17:30:47.367851 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.367669 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvgb7\" (UniqueName: 
\"kubernetes.io/projected/cb31c5bc-157d-4a7e-b27d-05554483dfa9-kube-api-access-dvgb7\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t\" (UID: \"cb31c5bc-157d-4a7e-b27d-05554483dfa9\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" Apr 24 17:30:47.367851 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.367703 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb31c5bc-157d-4a7e-b27d-05554483dfa9-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t\" (UID: \"cb31c5bc-157d-4a7e-b27d-05554483dfa9\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" Apr 24 17:30:47.367851 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.367731 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb31c5bc-157d-4a7e-b27d-05554483dfa9-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t\" (UID: \"cb31c5bc-157d-4a7e-b27d-05554483dfa9\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" Apr 24 17:30:47.368146 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.368121 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb31c5bc-157d-4a7e-b27d-05554483dfa9-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t\" (UID: \"cb31c5bc-157d-4a7e-b27d-05554483dfa9\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" Apr 24 17:30:47.368416 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.368394 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/cb31c5bc-157d-4a7e-b27d-05554483dfa9-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t\" (UID: \"cb31c5bc-157d-4a7e-b27d-05554483dfa9\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" Apr 24 17:30:47.370571 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.370552 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb31c5bc-157d-4a7e-b27d-05554483dfa9-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t\" (UID: \"cb31c5bc-157d-4a7e-b27d-05554483dfa9\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" Apr 24 17:30:47.376876 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.376834 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvgb7\" (UniqueName: \"kubernetes.io/projected/cb31c5bc-157d-4a7e-b27d-05554483dfa9-kube-api-access-dvgb7\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t\" (UID: \"cb31c5bc-157d-4a7e-b27d-05554483dfa9\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" Apr 24 17:30:47.412923 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.412884 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" Apr 24 17:30:47.542854 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.542769 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t"] Apr 24 17:30:47.546041 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:30:47.546005 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb31c5bc_157d_4a7e_b27d_05554483dfa9.slice/crio-cff5d08690fa7de65843a1087ec5d64b106081935f50a92b4f3b790dd305f8ab WatchSource:0}: Error finding container cff5d08690fa7de65843a1087ec5d64b106081935f50a92b4f3b790dd305f8ab: Status 404 returned error can't find the container with id cff5d08690fa7de65843a1087ec5d64b106081935f50a92b4f3b790dd305f8ab Apr 24 17:30:47.562700 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.562655 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" event={"ID":"cb31c5bc-157d-4a7e-b27d-05554483dfa9","Type":"ContainerStarted","Data":"cff5d08690fa7de65843a1087ec5d64b106081935f50a92b4f3b790dd305f8ab"} Apr 24 17:30:47.564727 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.564687 2573 generic.go:358] "Generic (PLEG): container finished" podID="9985b752-3f9c-4d46-a8a0-7c63991c6ed5" containerID="b479c10c35bd56b9badedd4e361112115d246b5f16ea6489015cd4d183c24ec3" exitCode=2 Apr 24 17:30:47.564976 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:47.564765 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" event={"ID":"9985b752-3f9c-4d46-a8a0-7c63991c6ed5","Type":"ContainerDied","Data":"b479c10c35bd56b9badedd4e361112115d246b5f16ea6489015cd4d183c24ec3"} Apr 24 17:30:48.568739 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:48.568695 2573 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" event={"ID":"cb31c5bc-157d-4a7e-b27d-05554483dfa9","Type":"ContainerStarted","Data":"2ac269d7d6445132b5b05fe6672d7f588c5eefe7a720477e1aafebbaf44eb18a"} Apr 24 17:30:49.483088 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:49.483036 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" podUID="9985b752-3f9c-4d46-a8a0-7c63991c6ed5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.52:8643/healthz\": dial tcp 10.134.0.52:8643: connect: connection refused" Apr 24 17:30:52.587644 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:52.587598 2573 generic.go:358] "Generic (PLEG): container finished" podID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerID="2ac269d7d6445132b5b05fe6672d7f588c5eefe7a720477e1aafebbaf44eb18a" exitCode=0 Apr 24 17:30:52.588049 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:52.587671 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" event={"ID":"cb31c5bc-157d-4a7e-b27d-05554483dfa9","Type":"ContainerDied","Data":"2ac269d7d6445132b5b05fe6672d7f588c5eefe7a720477e1aafebbaf44eb18a"} Apr 24 17:30:53.592857 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:53.592816 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" event={"ID":"cb31c5bc-157d-4a7e-b27d-05554483dfa9","Type":"ContainerStarted","Data":"97083178d9d5b94fcf3b3e8a04bf311252ebc2a4f2f8049607996a2a34fe57c1"} Apr 24 17:30:53.592857 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:53.592862 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" 
event={"ID":"cb31c5bc-157d-4a7e-b27d-05554483dfa9","Type":"ContainerStarted","Data":"7eb3b658fa84de453060cc68133c90061908f58387bf24887b49da3add4eedb8"} Apr 24 17:30:53.593340 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:53.593192 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" Apr 24 17:30:53.593340 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:53.593296 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" Apr 24 17:30:53.594671 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:53.594639 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" podUID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 24 17:30:53.611617 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:53.611563 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" podStartSLOduration=6.611543234 podStartE2EDuration="6.611543234s" podCreationTimestamp="2026-04-24 17:30:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:30:53.610557204 +0000 UTC m=+3111.875272605" watchObservedRunningTime="2026-04-24 17:30:53.611543234 +0000 UTC m=+3111.876258635" Apr 24 17:30:54.483292 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:54.483246 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" podUID="9985b752-3f9c-4d46-a8a0-7c63991c6ed5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.52:8643/healthz\": dial tcp 
10.134.0.52:8643: connect: connection refused" Apr 24 17:30:54.596325 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:54.596274 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" podUID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 24 17:30:59.482656 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:59.482604 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" podUID="9985b752-3f9c-4d46-a8a0-7c63991c6ed5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.52:8643/healthz\": dial tcp 10.134.0.52:8643: connect: connection refused" Apr 24 17:30:59.483070 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:59.482777 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" Apr 24 17:30:59.601283 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:59.601242 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" Apr 24 17:30:59.601875 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:30:59.601844 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" podUID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 24 17:31:04.482574 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:04.482521 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" podUID="9985b752-3f9c-4d46-a8a0-7c63991c6ed5" containerName="kube-rbac-proxy" probeResult="failure" output="Get 
\"https://10.134.0.52:8643/healthz\": dial tcp 10.134.0.52:8643: connect: connection refused" Apr 24 17:31:09.482851 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:09.482799 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" podUID="9985b752-3f9c-4d46-a8a0-7c63991c6ed5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.52:8643/healthz\": dial tcp 10.134.0.52:8643: connect: connection refused" Apr 24 17:31:09.602147 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:09.602115 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" Apr 24 17:31:14.482821 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:14.482771 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" podUID="9985b752-3f9c-4d46-a8a0-7c63991c6ed5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.52:8643/healthz\": dial tcp 10.134.0.52:8643: connect: connection refused" Apr 24 17:31:17.614592 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.614554 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" Apr 24 17:31:17.660864 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.660822 2573 generic.go:358] "Generic (PLEG): container finished" podID="9985b752-3f9c-4d46-a8a0-7c63991c6ed5" containerID="4add17fd0f452e36ca4dce622c597d57aea1b943f079b05be2f339fd8dd6da1c" exitCode=137 Apr 24 17:31:17.661063 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.660886 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" event={"ID":"9985b752-3f9c-4d46-a8a0-7c63991c6ed5","Type":"ContainerDied","Data":"4add17fd0f452e36ca4dce622c597d57aea1b943f079b05be2f339fd8dd6da1c"} Apr 24 17:31:17.661063 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.660910 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" Apr 24 17:31:17.661063 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.660927 2573 scope.go:117] "RemoveContainer" containerID="b479c10c35bd56b9badedd4e361112115d246b5f16ea6489015cd4d183c24ec3" Apr 24 17:31:17.661063 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.660915 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl" event={"ID":"9985b752-3f9c-4d46-a8a0-7c63991c6ed5","Type":"ContainerDied","Data":"52cc6aa59c1f87aa4c8124841301b2823e44dbe1480a3755c371b614931ffa2f"} Apr 24 17:31:17.668913 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.668886 2573 scope.go:117] "RemoveContainer" containerID="4add17fd0f452e36ca4dce622c597d57aea1b943f079b05be2f339fd8dd6da1c" Apr 24 17:31:17.677219 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.677195 2573 scope.go:117] "RemoveContainer" containerID="4a06bdfa6d699076b831539a93babf5c96a6dc2a94ae1e06e3a0a470efdd28c3" Apr 24 17:31:17.685596 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.685570 2573 
scope.go:117] "RemoveContainer" containerID="b479c10c35bd56b9badedd4e361112115d246b5f16ea6489015cd4d183c24ec3" Apr 24 17:31:17.685906 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:31:17.685885 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b479c10c35bd56b9badedd4e361112115d246b5f16ea6489015cd4d183c24ec3\": container with ID starting with b479c10c35bd56b9badedd4e361112115d246b5f16ea6489015cd4d183c24ec3 not found: ID does not exist" containerID="b479c10c35bd56b9badedd4e361112115d246b5f16ea6489015cd4d183c24ec3" Apr 24 17:31:17.685954 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.685920 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b479c10c35bd56b9badedd4e361112115d246b5f16ea6489015cd4d183c24ec3"} err="failed to get container status \"b479c10c35bd56b9badedd4e361112115d246b5f16ea6489015cd4d183c24ec3\": rpc error: code = NotFound desc = could not find container \"b479c10c35bd56b9badedd4e361112115d246b5f16ea6489015cd4d183c24ec3\": container with ID starting with b479c10c35bd56b9badedd4e361112115d246b5f16ea6489015cd4d183c24ec3 not found: ID does not exist" Apr 24 17:31:17.685954 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.685942 2573 scope.go:117] "RemoveContainer" containerID="4add17fd0f452e36ca4dce622c597d57aea1b943f079b05be2f339fd8dd6da1c" Apr 24 17:31:17.686215 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:31:17.686196 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4add17fd0f452e36ca4dce622c597d57aea1b943f079b05be2f339fd8dd6da1c\": container with ID starting with 4add17fd0f452e36ca4dce622c597d57aea1b943f079b05be2f339fd8dd6da1c not found: ID does not exist" containerID="4add17fd0f452e36ca4dce622c597d57aea1b943f079b05be2f339fd8dd6da1c" Apr 24 17:31:17.686269 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.686222 2573 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4add17fd0f452e36ca4dce622c597d57aea1b943f079b05be2f339fd8dd6da1c"} err="failed to get container status \"4add17fd0f452e36ca4dce622c597d57aea1b943f079b05be2f339fd8dd6da1c\": rpc error: code = NotFound desc = could not find container \"4add17fd0f452e36ca4dce622c597d57aea1b943f079b05be2f339fd8dd6da1c\": container with ID starting with 4add17fd0f452e36ca4dce622c597d57aea1b943f079b05be2f339fd8dd6da1c not found: ID does not exist" Apr 24 17:31:17.686269 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.686244 2573 scope.go:117] "RemoveContainer" containerID="4a06bdfa6d699076b831539a93babf5c96a6dc2a94ae1e06e3a0a470efdd28c3" Apr 24 17:31:17.686497 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:31:17.686480 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a06bdfa6d699076b831539a93babf5c96a6dc2a94ae1e06e3a0a470efdd28c3\": container with ID starting with 4a06bdfa6d699076b831539a93babf5c96a6dc2a94ae1e06e3a0a470efdd28c3 not found: ID does not exist" containerID="4a06bdfa6d699076b831539a93babf5c96a6dc2a94ae1e06e3a0a470efdd28c3" Apr 24 17:31:17.686542 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.686501 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a06bdfa6d699076b831539a93babf5c96a6dc2a94ae1e06e3a0a470efdd28c3"} err="failed to get container status \"4a06bdfa6d699076b831539a93babf5c96a6dc2a94ae1e06e3a0a470efdd28c3\": rpc error: code = NotFound desc = could not find container \"4a06bdfa6d699076b831539a93babf5c96a6dc2a94ae1e06e3a0a470efdd28c3\": container with ID starting with 4a06bdfa6d699076b831539a93babf5c96a6dc2a94ae1e06e3a0a470efdd28c3 not found: ID does not exist" Apr 24 17:31:17.710052 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.710015 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-xldz7\" (UniqueName: \"kubernetes.io/projected/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-kube-api-access-xldz7\") pod \"9985b752-3f9c-4d46-a8a0-7c63991c6ed5\" (UID: \"9985b752-3f9c-4d46-a8a0-7c63991c6ed5\") " Apr 24 17:31:17.710199 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.710065 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-kserve-provision-location\") pod \"9985b752-3f9c-4d46-a8a0-7c63991c6ed5\" (UID: \"9985b752-3f9c-4d46-a8a0-7c63991c6ed5\") " Apr 24 17:31:17.710199 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.710141 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"9985b752-3f9c-4d46-a8a0-7c63991c6ed5\" (UID: \"9985b752-3f9c-4d46-a8a0-7c63991c6ed5\") " Apr 24 17:31:17.710284 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.710203 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-proxy-tls\") pod \"9985b752-3f9c-4d46-a8a0-7c63991c6ed5\" (UID: \"9985b752-3f9c-4d46-a8a0-7c63991c6ed5\") " Apr 24 17:31:17.710622 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.710586 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-isvc-tensorflow-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-kube-rbac-proxy-sar-config") pod "9985b752-3f9c-4d46-a8a0-7c63991c6ed5" (UID: "9985b752-3f9c-4d46-a8a0-7c63991c6ed5"). InnerVolumeSpecName "isvc-tensorflow-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:31:17.712486 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.712454 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-kube-api-access-xldz7" (OuterVolumeSpecName: "kube-api-access-xldz7") pod "9985b752-3f9c-4d46-a8a0-7c63991c6ed5" (UID: "9985b752-3f9c-4d46-a8a0-7c63991c6ed5"). InnerVolumeSpecName "kube-api-access-xldz7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:31:17.712608 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.712522 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9985b752-3f9c-4d46-a8a0-7c63991c6ed5" (UID: "9985b752-3f9c-4d46-a8a0-7c63991c6ed5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:31:17.727263 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.727218 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9985b752-3f9c-4d46-a8a0-7c63991c6ed5" (UID: "9985b752-3f9c-4d46-a8a0-7c63991c6ed5"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:31:17.811385 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.811342 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:31:17.811385 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.811379 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-isvc-tensorflow-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:31:17.811616 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.811399 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:31:17.811616 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.811413 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xldz7\" (UniqueName: \"kubernetes.io/projected/9985b752-3f9c-4d46-a8a0-7c63991c6ed5-kube-api-access-xldz7\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:31:17.983546 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.983512 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl"] Apr 24 17:31:17.987011 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:17.986981 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-2pggl"] Apr 24 17:31:18.216213 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:18.216132 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9985b752-3f9c-4d46-a8a0-7c63991c6ed5" 
path="/var/lib/kubelet/pods/9985b752-3f9c-4d46-a8a0-7c63991c6ed5/volumes" Apr 24 17:31:29.165872 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.165783 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t"] Apr 24 17:31:29.166336 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.166153 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" podUID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerName="kserve-container" containerID="cri-o://7eb3b658fa84de453060cc68133c90061908f58387bf24887b49da3add4eedb8" gracePeriod=30 Apr 24 17:31:29.166336 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.166163 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" podUID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerName="kube-rbac-proxy" containerID="cri-o://97083178d9d5b94fcf3b3e8a04bf311252ebc2a4f2f8049607996a2a34fe57c1" gracePeriod=30 Apr 24 17:31:29.258160 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.258118 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz"] Apr 24 17:31:29.258447 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.258433 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9985b752-3f9c-4d46-a8a0-7c63991c6ed5" containerName="storage-initializer" Apr 24 17:31:29.258505 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.258448 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9985b752-3f9c-4d46-a8a0-7c63991c6ed5" containerName="storage-initializer" Apr 24 17:31:29.258505 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.258458 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9985b752-3f9c-4d46-a8a0-7c63991c6ed5" 
containerName="kserve-container" Apr 24 17:31:29.258505 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.258464 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9985b752-3f9c-4d46-a8a0-7c63991c6ed5" containerName="kserve-container" Apr 24 17:31:29.258505 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.258480 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9985b752-3f9c-4d46-a8a0-7c63991c6ed5" containerName="kube-rbac-proxy" Apr 24 17:31:29.258505 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.258486 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9985b752-3f9c-4d46-a8a0-7c63991c6ed5" containerName="kube-rbac-proxy" Apr 24 17:31:29.258659 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.258539 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9985b752-3f9c-4d46-a8a0-7c63991c6ed5" containerName="kube-rbac-proxy" Apr 24 17:31:29.258659 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.258549 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9985b752-3f9c-4d46-a8a0-7c63991c6ed5" containerName="kserve-container" Apr 24 17:31:29.263238 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.263218 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" Apr 24 17:31:29.265459 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.265435 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-kube-rbac-proxy-sar-config\"" Apr 24 17:31:29.265559 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.265435 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-predictor-serving-cert\"" Apr 24 17:31:29.272513 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.271581 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz"] Apr 24 17:31:29.309704 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.309664 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01cd39d3-557c-4864-a09a-03f5bd545ff5-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-swskz\" (UID: \"01cd39d3-557c-4864-a09a-03f5bd545ff5\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" Apr 24 17:31:29.309921 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.309722 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/01cd39d3-557c-4864-a09a-03f5bd545ff5-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-swskz\" (UID: \"01cd39d3-557c-4864-a09a-03f5bd545ff5\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" Apr 24 17:31:29.309921 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.309746 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qks9t\" (UniqueName: 
\"kubernetes.io/projected/01cd39d3-557c-4864-a09a-03f5bd545ff5-kube-api-access-qks9t\") pod \"isvc-triton-predictor-84bb65d94b-swskz\" (UID: \"01cd39d3-557c-4864-a09a-03f5bd545ff5\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" Apr 24 17:31:29.309921 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.309848 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/01cd39d3-557c-4864-a09a-03f5bd545ff5-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-swskz\" (UID: \"01cd39d3-557c-4864-a09a-03f5bd545ff5\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" Apr 24 17:31:29.410657 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.410606 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01cd39d3-557c-4864-a09a-03f5bd545ff5-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-swskz\" (UID: \"01cd39d3-557c-4864-a09a-03f5bd545ff5\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" Apr 24 17:31:29.410872 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.410667 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/01cd39d3-557c-4864-a09a-03f5bd545ff5-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-swskz\" (UID: \"01cd39d3-557c-4864-a09a-03f5bd545ff5\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" Apr 24 17:31:29.410872 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.410693 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qks9t\" (UniqueName: \"kubernetes.io/projected/01cd39d3-557c-4864-a09a-03f5bd545ff5-kube-api-access-qks9t\") pod \"isvc-triton-predictor-84bb65d94b-swskz\" (UID: 
\"01cd39d3-557c-4864-a09a-03f5bd545ff5\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" Apr 24 17:31:29.410872 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.410747 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/01cd39d3-557c-4864-a09a-03f5bd545ff5-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-swskz\" (UID: \"01cd39d3-557c-4864-a09a-03f5bd545ff5\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" Apr 24 17:31:29.410872 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:31:29.410864 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-triton-predictor-serving-cert: secret "isvc-triton-predictor-serving-cert" not found Apr 24 17:31:29.411079 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:31:29.410938 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01cd39d3-557c-4864-a09a-03f5bd545ff5-proxy-tls podName:01cd39d3-557c-4864-a09a-03f5bd545ff5 nodeName:}" failed. No retries permitted until 2026-04-24 17:31:29.91091628 +0000 UTC m=+3148.175631672 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/01cd39d3-557c-4864-a09a-03f5bd545ff5-proxy-tls") pod "isvc-triton-predictor-84bb65d94b-swskz" (UID: "01cd39d3-557c-4864-a09a-03f5bd545ff5") : secret "isvc-triton-predictor-serving-cert" not found Apr 24 17:31:29.411079 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.411041 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01cd39d3-557c-4864-a09a-03f5bd545ff5-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-swskz\" (UID: \"01cd39d3-557c-4864-a09a-03f5bd545ff5\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" Apr 24 17:31:29.411370 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.411349 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/01cd39d3-557c-4864-a09a-03f5bd545ff5-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-swskz\" (UID: \"01cd39d3-557c-4864-a09a-03f5bd545ff5\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" Apr 24 17:31:29.427166 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.427091 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qks9t\" (UniqueName: \"kubernetes.io/projected/01cd39d3-557c-4864-a09a-03f5bd545ff5-kube-api-access-qks9t\") pod \"isvc-triton-predictor-84bb65d94b-swskz\" (UID: \"01cd39d3-557c-4864-a09a-03f5bd545ff5\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" Apr 24 17:31:29.597365 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.597303 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" podUID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerName="kube-rbac-proxy" probeResult="failure" output="Get 
\"https://10.134.0.53:8643/healthz\": dial tcp 10.134.0.53:8643: connect: connection refused" Apr 24 17:31:29.699367 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.699247 2573 generic.go:358] "Generic (PLEG): container finished" podID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerID="97083178d9d5b94fcf3b3e8a04bf311252ebc2a4f2f8049607996a2a34fe57c1" exitCode=2 Apr 24 17:31:29.699367 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.699344 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" event={"ID":"cb31c5bc-157d-4a7e-b27d-05554483dfa9","Type":"ContainerDied","Data":"97083178d9d5b94fcf3b3e8a04bf311252ebc2a4f2f8049607996a2a34fe57c1"} Apr 24 17:31:29.914357 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.914287 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/01cd39d3-557c-4864-a09a-03f5bd545ff5-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-swskz\" (UID: \"01cd39d3-557c-4864-a09a-03f5bd545ff5\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" Apr 24 17:31:29.916959 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:29.916931 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/01cd39d3-557c-4864-a09a-03f5bd545ff5-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-swskz\" (UID: \"01cd39d3-557c-4864-a09a-03f5bd545ff5\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" Apr 24 17:31:30.175811 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:30.175768 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" Apr 24 17:31:30.304914 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:30.304884 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz"] Apr 24 17:31:30.307654 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:31:30.307620 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01cd39d3_557c_4864_a09a_03f5bd545ff5.slice/crio-da75756297fa99d455d310ec4c5911116b37b6f7b2bc6f95b1c20dfb736edb3f WatchSource:0}: Error finding container da75756297fa99d455d310ec4c5911116b37b6f7b2bc6f95b1c20dfb736edb3f: Status 404 returned error can't find the container with id da75756297fa99d455d310ec4c5911116b37b6f7b2bc6f95b1c20dfb736edb3f Apr 24 17:31:30.703230 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:30.703185 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" event={"ID":"01cd39d3-557c-4864-a09a-03f5bd545ff5","Type":"ContainerStarted","Data":"df5c411fd62939404c8872102c9c6f12ad31f34b9c6c312c2982df4de9efd642"} Apr 24 17:31:30.703230 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:30.703238 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" event={"ID":"01cd39d3-557c-4864-a09a-03f5bd545ff5","Type":"ContainerStarted","Data":"da75756297fa99d455d310ec4c5911116b37b6f7b2bc6f95b1c20dfb736edb3f"} Apr 24 17:31:34.597186 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:34.597137 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" podUID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.53:8643/healthz\": dial tcp 10.134.0.53:8643: connect: connection refused" Apr 24 17:31:34.715246 
ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:34.715207 2573 generic.go:358] "Generic (PLEG): container finished" podID="01cd39d3-557c-4864-a09a-03f5bd545ff5" containerID="df5c411fd62939404c8872102c9c6f12ad31f34b9c6c312c2982df4de9efd642" exitCode=0 Apr 24 17:31:34.715444 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:34.715278 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" event={"ID":"01cd39d3-557c-4864-a09a-03f5bd545ff5","Type":"ContainerDied","Data":"df5c411fd62939404c8872102c9c6f12ad31f34b9c6c312c2982df4de9efd642"} Apr 24 17:31:39.597914 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:39.597218 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" podUID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.53:8643/healthz\": dial tcp 10.134.0.53:8643: connect: connection refused" Apr 24 17:31:39.597914 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:39.597394 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" Apr 24 17:31:44.597056 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:44.597004 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" podUID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.53:8643/healthz\": dial tcp 10.134.0.53:8643: connect: connection refused" Apr 24 17:31:49.597514 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:49.597463 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" podUID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerName="kube-rbac-proxy" probeResult="failure" 
output="Get \"https://10.134.0.53:8643/healthz\": dial tcp 10.134.0.53:8643: connect: connection refused" Apr 24 17:31:54.596897 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:54.596832 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" podUID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.53:8643/healthz\": dial tcp 10.134.0.53:8643: connect: connection refused" Apr 24 17:31:59.597537 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:59.597481 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" podUID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.53:8643/healthz\": dial tcp 10.134.0.53:8643: connect: connection refused" Apr 24 17:31:59.603154 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:59.603102 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" podUID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 24 17:31:59.826834 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:59.826669 2573 generic.go:358] "Generic (PLEG): container finished" podID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerID="7eb3b658fa84de453060cc68133c90061908f58387bf24887b49da3add4eedb8" exitCode=137 Apr 24 17:31:59.826834 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:59.826775 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" event={"ID":"cb31c5bc-157d-4a7e-b27d-05554483dfa9","Type":"ContainerDied","Data":"7eb3b658fa84de453060cc68133c90061908f58387bf24887b49da3add4eedb8"} Apr 24 17:31:59.866240 
ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:59.866208 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" Apr 24 17:31:59.906631 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:59.906555 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb31c5bc-157d-4a7e-b27d-05554483dfa9-kserve-provision-location\") pod \"cb31c5bc-157d-4a7e-b27d-05554483dfa9\" (UID: \"cb31c5bc-157d-4a7e-b27d-05554483dfa9\") " Apr 24 17:31:59.906631 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:59.906638 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb31c5bc-157d-4a7e-b27d-05554483dfa9-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"cb31c5bc-157d-4a7e-b27d-05554483dfa9\" (UID: \"cb31c5bc-157d-4a7e-b27d-05554483dfa9\") " Apr 24 17:31:59.906890 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:59.906745 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvgb7\" (UniqueName: \"kubernetes.io/projected/cb31c5bc-157d-4a7e-b27d-05554483dfa9-kube-api-access-dvgb7\") pod \"cb31c5bc-157d-4a7e-b27d-05554483dfa9\" (UID: \"cb31c5bc-157d-4a7e-b27d-05554483dfa9\") " Apr 24 17:31:59.906890 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:59.906781 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb31c5bc-157d-4a7e-b27d-05554483dfa9-proxy-tls\") pod \"cb31c5bc-157d-4a7e-b27d-05554483dfa9\" (UID: \"cb31c5bc-157d-4a7e-b27d-05554483dfa9\") " Apr 24 17:31:59.907915 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:59.907856 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/cb31c5bc-157d-4a7e-b27d-05554483dfa9-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config") pod "cb31c5bc-157d-4a7e-b27d-05554483dfa9" (UID: "cb31c5bc-157d-4a7e-b27d-05554483dfa9"). InnerVolumeSpecName "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:31:59.911261 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:59.911222 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb31c5bc-157d-4a7e-b27d-05554483dfa9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cb31c5bc-157d-4a7e-b27d-05554483dfa9" (UID: "cb31c5bc-157d-4a7e-b27d-05554483dfa9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:31:59.912271 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:59.912238 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb31c5bc-157d-4a7e-b27d-05554483dfa9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cb31c5bc-157d-4a7e-b27d-05554483dfa9" (UID: "cb31c5bc-157d-4a7e-b27d-05554483dfa9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:31:59.912852 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:31:59.912752 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb31c5bc-157d-4a7e-b27d-05554483dfa9-kube-api-access-dvgb7" (OuterVolumeSpecName: "kube-api-access-dvgb7") pod "cb31c5bc-157d-4a7e-b27d-05554483dfa9" (UID: "cb31c5bc-157d-4a7e-b27d-05554483dfa9"). InnerVolumeSpecName "kube-api-access-dvgb7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:32:00.007585 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:32:00.007551 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dvgb7\" (UniqueName: \"kubernetes.io/projected/cb31c5bc-157d-4a7e-b27d-05554483dfa9-kube-api-access-dvgb7\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:32:00.007585 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:32:00.007583 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cb31c5bc-157d-4a7e-b27d-05554483dfa9-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:32:00.007585 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:32:00.007595 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb31c5bc-157d-4a7e-b27d-05554483dfa9-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:32:00.008049 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:32:00.007607 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/cb31c5bc-157d-4a7e-b27d-05554483dfa9-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:32:00.832760 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:32:00.832715 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" event={"ID":"cb31c5bc-157d-4a7e-b27d-05554483dfa9","Type":"ContainerDied","Data":"cff5d08690fa7de65843a1087ec5d64b106081935f50a92b4f3b790dd305f8ab"} Apr 24 17:32:00.832760 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:32:00.832758 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t" Apr 24 17:32:00.833786 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:32:00.832776 2573 scope.go:117] "RemoveContainer" containerID="97083178d9d5b94fcf3b3e8a04bf311252ebc2a4f2f8049607996a2a34fe57c1" Apr 24 17:32:00.843754 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:32:00.843664 2573 scope.go:117] "RemoveContainer" containerID="7eb3b658fa84de453060cc68133c90061908f58387bf24887b49da3add4eedb8" Apr 24 17:32:00.854602 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:32:00.854572 2573 scope.go:117] "RemoveContainer" containerID="2ac269d7d6445132b5b05fe6672d7f588c5eefe7a720477e1aafebbaf44eb18a" Apr 24 17:32:00.854726 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:32:00.854692 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t"] Apr 24 17:32:00.856188 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:32:00.856154 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-f4t7t"] Apr 24 17:32:02.219010 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:32:02.218595 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" path="/var/lib/kubelet/pods/cb31c5bc-157d-4a7e-b27d-05554483dfa9/volumes" Apr 24 17:33:30.119745 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:30.119704 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" event={"ID":"01cd39d3-557c-4864-a09a-03f5bd545ff5","Type":"ContainerStarted","Data":"a1708926e0ca8a461e5993a0ac160322bfa3e5073a6738b5f5ba4c0bf5143433"} Apr 24 17:33:30.119745 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:30.119751 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" 
event={"ID":"01cd39d3-557c-4864-a09a-03f5bd545ff5","Type":"ContainerStarted","Data":"10da4dd5ba0cd527de0288a50c80f6c749b1b08c966c9fbb6f197046837d4fbb"} Apr 24 17:33:30.120228 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:30.119849 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" Apr 24 17:33:30.146412 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:30.146353 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" podStartSLOduration=6.4707774 podStartE2EDuration="2m1.146336963s" podCreationTimestamp="2026-04-24 17:31:29 +0000 UTC" firstStartedPulling="2026-04-24 17:31:34.716474715 +0000 UTC m=+3152.981190091" lastFinishedPulling="2026-04-24 17:33:29.392034278 +0000 UTC m=+3267.656749654" observedRunningTime="2026-04-24 17:33:30.144741443 +0000 UTC m=+3268.409456841" watchObservedRunningTime="2026-04-24 17:33:30.146336963 +0000 UTC m=+3268.411052358" Apr 24 17:33:31.122817 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:31.122778 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" Apr 24 17:33:31.124074 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:31.124045 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" podUID="01cd39d3-557c-4864-a09a-03f5bd545ff5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 24 17:33:32.125182 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:32.125132 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" podUID="01cd39d3-557c-4864-a09a-03f5bd545ff5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" 
Apr 24 17:33:37.129810 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:37.129777 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz"
Apr 24 17:33:37.130763 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:37.130740 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz"
Apr 24 17:33:42.052772 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.052726 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz"]
Apr 24 17:33:42.053229 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.053044 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" podUID="01cd39d3-557c-4864-a09a-03f5bd545ff5" containerName="kserve-container" containerID="cri-o://10da4dd5ba0cd527de0288a50c80f6c749b1b08c966c9fbb6f197046837d4fbb" gracePeriod=30
Apr 24 17:33:42.053229 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.053120 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" podUID="01cd39d3-557c-4864-a09a-03f5bd545ff5" containerName="kube-rbac-proxy" containerID="cri-o://a1708926e0ca8a461e5993a0ac160322bfa3e5073a6738b5f5ba4c0bf5143433" gracePeriod=30
Apr 24 17:33:42.125950 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.125904 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" podUID="01cd39d3-557c-4864-a09a-03f5bd545ff5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.54:8643/healthz\": dial tcp 10.134.0.54:8643: connect: connection refused"
Apr 24 17:33:42.198623 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.198578 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"]
Apr 24 17:33:42.198957 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.198938 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerName="kserve-container"
Apr 24 17:33:42.199040 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.198960 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerName="kserve-container"
Apr 24 17:33:42.199040 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.198975 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerName="kube-rbac-proxy"
Apr 24 17:33:42.199040 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.198982 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerName="kube-rbac-proxy"
Apr 24 17:33:42.199040 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.198999 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerName="storage-initializer"
Apr 24 17:33:42.199040 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.199008 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerName="storage-initializer"
Apr 24 17:33:42.199333 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.199092 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerName="kube-rbac-proxy"
Apr 24 17:33:42.199333 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.199107 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb31c5bc-157d-4a7e-b27d-05554483dfa9" containerName="kserve-container"
Apr 24 17:33:42.209319 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.209284 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"
Apr 24 17:33:42.211767 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.211733 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-kube-rbac-proxy-sar-config\""
Apr 24 17:33:42.212080 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.212055 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-predictor-serving-cert\""
Apr 24 17:33:42.217064 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.217035 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"]
Apr 24 17:33:42.333846 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.333746 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lnwl\" (UniqueName: \"kubernetes.io/projected/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-kube-api-access-4lnwl\") pod \"isvc-xgboost-predictor-8689c4cfcc-sz9kp\" (UID: \"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"
Apr 24 17:33:42.333846 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.333805 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-sz9kp\" (UID: \"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"
Apr 24 17:33:42.333846 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.333831 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-sz9kp\" (UID: \"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"
Apr 24 17:33:42.334079 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.333888 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-sz9kp\" (UID: \"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"
Apr 24 17:33:42.435128 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.435088 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-sz9kp\" (UID: \"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"
Apr 24 17:33:42.435366 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.435137 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4lnwl\" (UniqueName: \"kubernetes.io/projected/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-kube-api-access-4lnwl\") pod \"isvc-xgboost-predictor-8689c4cfcc-sz9kp\" (UID: \"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"
Apr 24 17:33:42.435366 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.435174 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-sz9kp\" (UID: \"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"
Apr 24 17:33:42.435366 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.435194 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-sz9kp\" (UID: \"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"
Apr 24 17:33:42.435558 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:33:42.435358 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-predictor-serving-cert: secret "isvc-xgboost-predictor-serving-cert" not found
Apr 24 17:33:42.435558 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:33:42.435448 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-proxy-tls podName:5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c nodeName:}" failed. No retries permitted until 2026-04-24 17:33:42.935424577 +0000 UTC m=+3281.200139966 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-proxy-tls") pod "isvc-xgboost-predictor-8689c4cfcc-sz9kp" (UID: "5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c") : secret "isvc-xgboost-predictor-serving-cert" not found
Apr 24 17:33:42.435647 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.435549 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-sz9kp\" (UID: \"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"
Apr 24 17:33:42.435842 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.435818 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-sz9kp\" (UID: \"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"
Apr 24 17:33:42.444718 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.444681 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lnwl\" (UniqueName: \"kubernetes.io/projected/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-kube-api-access-4lnwl\") pod \"isvc-xgboost-predictor-8689c4cfcc-sz9kp\" (UID: \"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"
Apr 24 17:33:42.940526 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.940485 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-sz9kp\" (UID: \"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"
Apr 24 17:33:42.943291 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:42.943251 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-sz9kp\" (UID: \"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"
Apr 24 17:33:43.122044 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:43.121991 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"
Apr 24 17:33:43.158174 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:43.158138 2573 generic.go:358] "Generic (PLEG): container finished" podID="01cd39d3-557c-4864-a09a-03f5bd545ff5" containerID="a1708926e0ca8a461e5993a0ac160322bfa3e5073a6738b5f5ba4c0bf5143433" exitCode=2
Apr 24 17:33:43.158379 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:43.158228 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" event={"ID":"01cd39d3-557c-4864-a09a-03f5bd545ff5","Type":"ContainerDied","Data":"a1708926e0ca8a461e5993a0ac160322bfa3e5073a6738b5f5ba4c0bf5143433"}
Apr 24 17:33:43.286004 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:43.285977 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"]
Apr 24 17:33:43.289250 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:33:43.289213 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e0a18a3_fbc1_4e41_bb69_8e1781bb9c6c.slice/crio-1936a05b95bcc58b4fa753ecdb32a70a3e8ca04f631636921934f293defb22f6 WatchSource:0}: Error finding container 1936a05b95bcc58b4fa753ecdb32a70a3e8ca04f631636921934f293defb22f6: Status 404 returned error can't find the container with id 1936a05b95bcc58b4fa753ecdb32a70a3e8ca04f631636921934f293defb22f6
Apr 24 17:33:43.291612 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:43.291592 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 17:33:44.163408 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:44.163356 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp" event={"ID":"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c","Type":"ContainerStarted","Data":"36e62cc3add3cb4ae6c5bf0371599103d353e37f07a6468a783077c190b6259e"}
Apr 24 17:33:44.163408 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:44.163403 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp" event={"ID":"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c","Type":"ContainerStarted","Data":"1936a05b95bcc58b4fa753ecdb32a70a3e8ca04f631636921934f293defb22f6"}
Apr 24 17:33:44.414221 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:44.414134 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz"
Apr 24 17:33:44.552916 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:44.552875 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/01cd39d3-557c-4864-a09a-03f5bd545ff5-isvc-triton-kube-rbac-proxy-sar-config\") pod \"01cd39d3-557c-4864-a09a-03f5bd545ff5\" (UID: \"01cd39d3-557c-4864-a09a-03f5bd545ff5\") "
Apr 24 17:33:44.552916 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:44.552917 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/01cd39d3-557c-4864-a09a-03f5bd545ff5-proxy-tls\") pod \"01cd39d3-557c-4864-a09a-03f5bd545ff5\" (UID: \"01cd39d3-557c-4864-a09a-03f5bd545ff5\") "
Apr 24 17:33:44.553154 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:44.552948 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01cd39d3-557c-4864-a09a-03f5bd545ff5-kserve-provision-location\") pod \"01cd39d3-557c-4864-a09a-03f5bd545ff5\" (UID: \"01cd39d3-557c-4864-a09a-03f5bd545ff5\") "
Apr 24 17:33:44.553154 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:44.552993 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qks9t\" (UniqueName: \"kubernetes.io/projected/01cd39d3-557c-4864-a09a-03f5bd545ff5-kube-api-access-qks9t\") pod \"01cd39d3-557c-4864-a09a-03f5bd545ff5\" (UID: \"01cd39d3-557c-4864-a09a-03f5bd545ff5\") "
Apr 24 17:33:44.553397 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:44.553360 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01cd39d3-557c-4864-a09a-03f5bd545ff5-isvc-triton-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-triton-kube-rbac-proxy-sar-config") pod "01cd39d3-557c-4864-a09a-03f5bd545ff5" (UID: "01cd39d3-557c-4864-a09a-03f5bd545ff5"). InnerVolumeSpecName "isvc-triton-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 17:33:44.553528 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:44.553423 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01cd39d3-557c-4864-a09a-03f5bd545ff5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "01cd39d3-557c-4864-a09a-03f5bd545ff5" (UID: "01cd39d3-557c-4864-a09a-03f5bd545ff5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 17:33:44.555390 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:44.555359 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01cd39d3-557c-4864-a09a-03f5bd545ff5-kube-api-access-qks9t" (OuterVolumeSpecName: "kube-api-access-qks9t") pod "01cd39d3-557c-4864-a09a-03f5bd545ff5" (UID: "01cd39d3-557c-4864-a09a-03f5bd545ff5"). InnerVolumeSpecName "kube-api-access-qks9t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 17:33:44.555483 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:44.555378 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01cd39d3-557c-4864-a09a-03f5bd545ff5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "01cd39d3-557c-4864-a09a-03f5bd545ff5" (UID: "01cd39d3-557c-4864-a09a-03f5bd545ff5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 17:33:44.654394 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:44.654347 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/01cd39d3-557c-4864-a09a-03f5bd545ff5-isvc-triton-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 17:33:44.654394 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:44.654388 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/01cd39d3-557c-4864-a09a-03f5bd545ff5-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 17:33:44.654394 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:44.654404 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01cd39d3-557c-4864-a09a-03f5bd545ff5-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 17:33:44.654646 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:44.654417 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qks9t\" (UniqueName: \"kubernetes.io/projected/01cd39d3-557c-4864-a09a-03f5bd545ff5-kube-api-access-qks9t\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 17:33:45.167404 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:45.167366 2573 generic.go:358] "Generic (PLEG): container finished" podID="01cd39d3-557c-4864-a09a-03f5bd545ff5" containerID="10da4dd5ba0cd527de0288a50c80f6c749b1b08c966c9fbb6f197046837d4fbb" exitCode=0
Apr 24 17:33:45.167915 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:45.167455 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" event={"ID":"01cd39d3-557c-4864-a09a-03f5bd545ff5","Type":"ContainerDied","Data":"10da4dd5ba0cd527de0288a50c80f6c749b1b08c966c9fbb6f197046837d4fbb"}
Apr 24 17:33:45.167915 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:45.167492 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz" event={"ID":"01cd39d3-557c-4864-a09a-03f5bd545ff5","Type":"ContainerDied","Data":"da75756297fa99d455d310ec4c5911116b37b6f7b2bc6f95b1c20dfb736edb3f"}
Apr 24 17:33:45.167915 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:45.167492 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz"
Apr 24 17:33:45.167915 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:45.167511 2573 scope.go:117] "RemoveContainer" containerID="a1708926e0ca8a461e5993a0ac160322bfa3e5073a6738b5f5ba4c0bf5143433"
Apr 24 17:33:45.176484 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:45.176461 2573 scope.go:117] "RemoveContainer" containerID="10da4dd5ba0cd527de0288a50c80f6c749b1b08c966c9fbb6f197046837d4fbb"
Apr 24 17:33:45.184612 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:45.184587 2573 scope.go:117] "RemoveContainer" containerID="df5c411fd62939404c8872102c9c6f12ad31f34b9c6c312c2982df4de9efd642"
Apr 24 17:33:45.190378 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:45.190343 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz"]
Apr 24 17:33:45.193215 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:45.193194 2573 scope.go:117] "RemoveContainer" containerID="a1708926e0ca8a461e5993a0ac160322bfa3e5073a6738b5f5ba4c0bf5143433"
Apr 24 17:33:45.193611 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:33:45.193588 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1708926e0ca8a461e5993a0ac160322bfa3e5073a6738b5f5ba4c0bf5143433\": container with ID starting with a1708926e0ca8a461e5993a0ac160322bfa3e5073a6738b5f5ba4c0bf5143433 not found: ID does not exist" containerID="a1708926e0ca8a461e5993a0ac160322bfa3e5073a6738b5f5ba4c0bf5143433"
Apr 24 17:33:45.193705 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:45.193620 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1708926e0ca8a461e5993a0ac160322bfa3e5073a6738b5f5ba4c0bf5143433"} err="failed to get container status \"a1708926e0ca8a461e5993a0ac160322bfa3e5073a6738b5f5ba4c0bf5143433\": rpc error: code = NotFound desc = could not find container \"a1708926e0ca8a461e5993a0ac160322bfa3e5073a6738b5f5ba4c0bf5143433\": container with ID starting with a1708926e0ca8a461e5993a0ac160322bfa3e5073a6738b5f5ba4c0bf5143433 not found: ID does not exist"
Apr 24 17:33:45.193705 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:45.193643 2573 scope.go:117] "RemoveContainer" containerID="10da4dd5ba0cd527de0288a50c80f6c749b1b08c966c9fbb6f197046837d4fbb"
Apr 24 17:33:45.193956 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:33:45.193939 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10da4dd5ba0cd527de0288a50c80f6c749b1b08c966c9fbb6f197046837d4fbb\": container with ID starting with 10da4dd5ba0cd527de0288a50c80f6c749b1b08c966c9fbb6f197046837d4fbb not found: ID does not exist" containerID="10da4dd5ba0cd527de0288a50c80f6c749b1b08c966c9fbb6f197046837d4fbb"
Apr 24 17:33:45.194006 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:45.193962 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10da4dd5ba0cd527de0288a50c80f6c749b1b08c966c9fbb6f197046837d4fbb"} err="failed to get container status \"10da4dd5ba0cd527de0288a50c80f6c749b1b08c966c9fbb6f197046837d4fbb\": rpc error: code = NotFound desc = could not find container \"10da4dd5ba0cd527de0288a50c80f6c749b1b08c966c9fbb6f197046837d4fbb\": container with ID starting with 10da4dd5ba0cd527de0288a50c80f6c749b1b08c966c9fbb6f197046837d4fbb not found: ID does not exist"
Apr 24 17:33:45.194006 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:45.193981 2573 scope.go:117] "RemoveContainer" containerID="df5c411fd62939404c8872102c9c6f12ad31f34b9c6c312c2982df4de9efd642"
Apr 24 17:33:45.194006 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:45.193984 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-swskz"]
Apr 24 17:33:45.194256 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:33:45.194240 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df5c411fd62939404c8872102c9c6f12ad31f34b9c6c312c2982df4de9efd642\": container with ID starting with df5c411fd62939404c8872102c9c6f12ad31f34b9c6c312c2982df4de9efd642 not found: ID does not exist" containerID="df5c411fd62939404c8872102c9c6f12ad31f34b9c6c312c2982df4de9efd642"
Apr 24 17:33:45.194331 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:45.194262 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df5c411fd62939404c8872102c9c6f12ad31f34b9c6c312c2982df4de9efd642"} err="failed to get container status \"df5c411fd62939404c8872102c9c6f12ad31f34b9c6c312c2982df4de9efd642\": rpc error: code = NotFound desc = could not find container \"df5c411fd62939404c8872102c9c6f12ad31f34b9c6c312c2982df4de9efd642\": container with ID starting with df5c411fd62939404c8872102c9c6f12ad31f34b9c6c312c2982df4de9efd642 not found: ID does not exist"
Apr 24 17:33:46.216234 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:46.216195 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01cd39d3-557c-4864-a09a-03f5bd545ff5" path="/var/lib/kubelet/pods/01cd39d3-557c-4864-a09a-03f5bd545ff5/volumes"
Apr 24 17:33:48.179269 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:48.179232 2573 generic.go:358] "Generic (PLEG): container finished" podID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" containerID="36e62cc3add3cb4ae6c5bf0371599103d353e37f07a6468a783077c190b6259e" exitCode=0
Apr 24 17:33:48.179784 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:33:48.179329 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp" event={"ID":"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c","Type":"ContainerDied","Data":"36e62cc3add3cb4ae6c5bf0371599103d353e37f07a6468a783077c190b6259e"}
Apr 24 17:34:07.742411 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:34:07.742382 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log"
Apr 24 17:34:07.742911 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:34:07.742382 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log"
Apr 24 17:34:08.245848 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:34:08.245819 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp" event={"ID":"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c","Type":"ContainerStarted","Data":"2af521ddedd12865a9133f8e537afbd87d5f99ef273f6db01342a491b4ae6b34"}
Apr 24 17:34:09.250807 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:34:09.250766 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp" event={"ID":"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c","Type":"ContainerStarted","Data":"88592d93a488d16d61597aab0b5ace9e1a4e34918b603aa27760393e5039c416"}
Apr 24 17:34:09.251280 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:34:09.250959 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"
Apr 24 17:34:09.251280 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:34:09.251097 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"
Apr 24 17:34:09.252423 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:34:09.252390 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp" podUID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 24 17:34:09.268859 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:34:09.268785 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp" podStartSLOduration=7.325299861 podStartE2EDuration="27.268769196s" podCreationTimestamp="2026-04-24 17:33:42 +0000 UTC" firstStartedPulling="2026-04-24 17:33:48.180702698 +0000 UTC m=+3286.445418074" lastFinishedPulling="2026-04-24 17:34:08.124172029 +0000 UTC m=+3306.388887409" observedRunningTime="2026-04-24 17:34:09.267674576 +0000 UTC m=+3307.532389998" watchObservedRunningTime="2026-04-24 17:34:09.268769196 +0000 UTC m=+3307.533484594"
Apr 24 17:34:10.254298 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:34:10.254256 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp" podUID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 24 17:34:15.259263 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:34:15.259228 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"
Apr 24 17:34:15.259888 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:34:15.259860 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp" podUID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 24 17:34:25.260326 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:34:25.260213 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp" podUID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 24 17:34:35.259956 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:34:35.259912 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp" podUID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 24 17:34:45.260265 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:34:45.260216 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp" podUID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 24 17:34:55.259865 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:34:55.259825 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp" podUID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 24 17:35:05.260869 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:05.260823 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp" podUID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 24 17:35:15.261137 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:15.261104 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"
Apr 24 17:35:22.128584 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.128551 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"]
Apr 24 17:35:22.128991 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.128895 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp" podUID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" containerName="kserve-container" containerID="cri-o://2af521ddedd12865a9133f8e537afbd87d5f99ef273f6db01342a491b4ae6b34" gracePeriod=30
Apr 24 17:35:22.129054 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.128964 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp" podUID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" containerName="kube-rbac-proxy" containerID="cri-o://88592d93a488d16d61597aab0b5ace9e1a4e34918b603aa27760393e5039c416" gracePeriod=30
Apr 24 17:35:22.244070 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.244038 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f"]
Apr 24 17:35:22.244426 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.244408 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01cd39d3-557c-4864-a09a-03f5bd545ff5" containerName="kserve-container"
Apr 24 17:35:22.244480 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.244430 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="01cd39d3-557c-4864-a09a-03f5bd545ff5" containerName="kserve-container"
Apr 24 17:35:22.244480 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.244446 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01cd39d3-557c-4864-a09a-03f5bd545ff5" containerName="kube-rbac-proxy"
Apr 24 17:35:22.244480 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.244452 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="01cd39d3-557c-4864-a09a-03f5bd545ff5" containerName="kube-rbac-proxy"
Apr 24 17:35:22.244480 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.244469 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01cd39d3-557c-4864-a09a-03f5bd545ff5" containerName="storage-initializer"
Apr 24 17:35:22.244480 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.244475 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="01cd39d3-557c-4864-a09a-03f5bd545ff5" containerName="storage-initializer"
Apr 24 17:35:22.244643 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.244532 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="01cd39d3-557c-4864-a09a-03f5bd545ff5" containerName="kserve-container"
Apr 24 17:35:22.244643 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.244540 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="01cd39d3-557c-4864-a09a-03f5bd545ff5" containerName="kube-rbac-proxy"
Apr 24 17:35:22.247645 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.247625 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f"
Apr 24 17:35:22.249849 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.249825 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-predictor-serving-cert\""
Apr 24 17:35:22.250001 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.249880 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\""
Apr 24 17:35:22.258455 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.258421 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f"]
Apr 24 17:35:22.290412 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.290366 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-448gd\" (UniqueName: \"kubernetes.io/projected/a9e3c445-9972-49f9-92af-255429d02bd7-kube-api-access-448gd\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f\" (UID: \"a9e3c445-9972-49f9-92af-255429d02bd7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f"
Apr 24 17:35:22.290600 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.290459 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9e3c445-9972-49f9-92af-255429d02bd7-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f\" (UID: \"a9e3c445-9972-49f9-92af-255429d02bd7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f"
Apr 24 17:35:22.290600 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.290497 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9e3c445-9972-49f9-92af-255429d02bd7-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f\" (UID: \"a9e3c445-9972-49f9-92af-255429d02bd7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f"
Apr 24 17:35:22.290600 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.290536 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a9e3c445-9972-49f9-92af-255429d02bd7-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f\" (UID: \"a9e3c445-9972-49f9-92af-255429d02bd7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f"
Apr 24 17:35:22.391437 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.391330 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-448gd\" (UniqueName: \"kubernetes.io/projected/a9e3c445-9972-49f9-92af-255429d02bd7-kube-api-access-448gd\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f\" (UID: \"a9e3c445-9972-49f9-92af-255429d02bd7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f"
Apr 24 17:35:22.391437 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.391401 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9e3c445-9972-49f9-92af-255429d02bd7-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f\" (UID: \"a9e3c445-9972-49f9-92af-255429d02bd7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f"
Apr 24 17:35:22.391437 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.391430 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/a9e3c445-9972-49f9-92af-255429d02bd7-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f\" (UID: \"a9e3c445-9972-49f9-92af-255429d02bd7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" Apr 24 17:35:22.391680 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.391448 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a9e3c445-9972-49f9-92af-255429d02bd7-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f\" (UID: \"a9e3c445-9972-49f9-92af-255429d02bd7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" Apr 24 17:35:22.391859 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.391840 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9e3c445-9972-49f9-92af-255429d02bd7-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f\" (UID: \"a9e3c445-9972-49f9-92af-255429d02bd7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" Apr 24 17:35:22.392226 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.392197 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a9e3c445-9972-49f9-92af-255429d02bd7-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f\" (UID: \"a9e3c445-9972-49f9-92af-255429d02bd7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" Apr 24 17:35:22.394325 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.394289 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/a9e3c445-9972-49f9-92af-255429d02bd7-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f\" (UID: \"a9e3c445-9972-49f9-92af-255429d02bd7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" Apr 24 17:35:22.399270 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.399243 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-448gd\" (UniqueName: \"kubernetes.io/projected/a9e3c445-9972-49f9-92af-255429d02bd7-kube-api-access-448gd\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f\" (UID: \"a9e3c445-9972-49f9-92af-255429d02bd7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" Apr 24 17:35:22.476223 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.476183 2573 generic.go:358] "Generic (PLEG): container finished" podID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" containerID="88592d93a488d16d61597aab0b5ace9e1a4e34918b603aa27760393e5039c416" exitCode=2 Apr 24 17:35:22.476422 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.476246 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp" event={"ID":"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c","Type":"ContainerDied","Data":"88592d93a488d16d61597aab0b5ace9e1a4e34918b603aa27760393e5039c416"} Apr 24 17:35:22.558751 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.558701 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" Apr 24 17:35:22.688815 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:22.688741 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f"] Apr 24 17:35:22.691610 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:35:22.691578 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9e3c445_9972_49f9_92af_255429d02bd7.slice/crio-faf0db3d2b3baceef1f56180d11afefcdd6017bf1dbaae8d26e76e65a0ee341e WatchSource:0}: Error finding container faf0db3d2b3baceef1f56180d11afefcdd6017bf1dbaae8d26e76e65a0ee341e: Status 404 returned error can't find the container with id faf0db3d2b3baceef1f56180d11afefcdd6017bf1dbaae8d26e76e65a0ee341e Apr 24 17:35:23.479975 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:23.479931 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" event={"ID":"a9e3c445-9972-49f9-92af-255429d02bd7","Type":"ContainerStarted","Data":"a9a0cdcad39e1ee95702cc6a4f9b022312ace19f198d634d05774500f05c003a"} Apr 24 17:35:23.479975 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:23.479982 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" event={"ID":"a9e3c445-9972-49f9-92af-255429d02bd7","Type":"ContainerStarted","Data":"faf0db3d2b3baceef1f56180d11afefcdd6017bf1dbaae8d26e76e65a0ee341e"} Apr 24 17:35:25.255123 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:25.255072 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp" podUID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.55:8643/healthz\": dial tcp 10.134.0.55:8643: connect: 
connection refused" Apr 24 17:35:25.260399 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:25.260360 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp" podUID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 24 17:35:26.274187 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.274155 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp" Apr 24 17:35:26.319873 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.319769 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-proxy-tls\") pod \"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c\" (UID: \"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c\") " Apr 24 17:35:26.319873 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.319829 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-kserve-provision-location\") pod \"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c\" (UID: \"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c\") " Apr 24 17:35:26.320129 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.319938 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lnwl\" (UniqueName: \"kubernetes.io/projected/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-kube-api-access-4lnwl\") pod \"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c\" (UID: \"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c\") " Apr 24 17:35:26.320129 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.319973 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c\" (UID: \"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c\") " Apr 24 17:35:26.320252 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.320165 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" (UID: "5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:35:26.320485 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.320455 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-isvc-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-kube-rbac-proxy-sar-config") pod "5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" (UID: "5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c"). InnerVolumeSpecName "isvc-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:35:26.322264 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.322234 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" (UID: "5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:35:26.322264 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.322253 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-kube-api-access-4lnwl" (OuterVolumeSpecName: "kube-api-access-4lnwl") pod "5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" (UID: "5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c"). InnerVolumeSpecName "kube-api-access-4lnwl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:35:26.420912 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.420860 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4lnwl\" (UniqueName: \"kubernetes.io/projected/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-kube-api-access-4lnwl\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:35:26.420912 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.420906 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-isvc-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:35:26.420912 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.420919 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:35:26.421265 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.420929 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:35:26.490249 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.490204 2573 generic.go:358] "Generic (PLEG): container finished" 
podID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" containerID="2af521ddedd12865a9133f8e537afbd87d5f99ef273f6db01342a491b4ae6b34" exitCode=0 Apr 24 17:35:26.490451 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.490292 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp" Apr 24 17:35:26.490451 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.490292 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp" event={"ID":"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c","Type":"ContainerDied","Data":"2af521ddedd12865a9133f8e537afbd87d5f99ef273f6db01342a491b4ae6b34"} Apr 24 17:35:26.490451 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.490361 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp" event={"ID":"5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c","Type":"ContainerDied","Data":"1936a05b95bcc58b4fa753ecdb32a70a3e8ca04f631636921934f293defb22f6"} Apr 24 17:35:26.490451 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.490378 2573 scope.go:117] "RemoveContainer" containerID="88592d93a488d16d61597aab0b5ace9e1a4e34918b603aa27760393e5039c416" Apr 24 17:35:26.499423 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.499400 2573 scope.go:117] "RemoveContainer" containerID="2af521ddedd12865a9133f8e537afbd87d5f99ef273f6db01342a491b4ae6b34" Apr 24 17:35:26.509242 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.508469 2573 scope.go:117] "RemoveContainer" containerID="36e62cc3add3cb4ae6c5bf0371599103d353e37f07a6468a783077c190b6259e" Apr 24 17:35:26.517226 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.517189 2573 scope.go:117] "RemoveContainer" containerID="88592d93a488d16d61597aab0b5ace9e1a4e34918b603aa27760393e5039c416" Apr 24 17:35:26.517592 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:35:26.517550 2573 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"88592d93a488d16d61597aab0b5ace9e1a4e34918b603aa27760393e5039c416\": container with ID starting with 88592d93a488d16d61597aab0b5ace9e1a4e34918b603aa27760393e5039c416 not found: ID does not exist" containerID="88592d93a488d16d61597aab0b5ace9e1a4e34918b603aa27760393e5039c416" Apr 24 17:35:26.517701 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.517590 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88592d93a488d16d61597aab0b5ace9e1a4e34918b603aa27760393e5039c416"} err="failed to get container status \"88592d93a488d16d61597aab0b5ace9e1a4e34918b603aa27760393e5039c416\": rpc error: code = NotFound desc = could not find container \"88592d93a488d16d61597aab0b5ace9e1a4e34918b603aa27760393e5039c416\": container with ID starting with 88592d93a488d16d61597aab0b5ace9e1a4e34918b603aa27760393e5039c416 not found: ID does not exist" Apr 24 17:35:26.517701 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.517616 2573 scope.go:117] "RemoveContainer" containerID="2af521ddedd12865a9133f8e537afbd87d5f99ef273f6db01342a491b4ae6b34" Apr 24 17:35:26.518164 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:35:26.518128 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2af521ddedd12865a9133f8e537afbd87d5f99ef273f6db01342a491b4ae6b34\": container with ID starting with 2af521ddedd12865a9133f8e537afbd87d5f99ef273f6db01342a491b4ae6b34 not found: ID does not exist" containerID="2af521ddedd12865a9133f8e537afbd87d5f99ef273f6db01342a491b4ae6b34" Apr 24 17:35:26.518274 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.518173 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2af521ddedd12865a9133f8e537afbd87d5f99ef273f6db01342a491b4ae6b34"} err="failed to get container status \"2af521ddedd12865a9133f8e537afbd87d5f99ef273f6db01342a491b4ae6b34\": 
rpc error: code = NotFound desc = could not find container \"2af521ddedd12865a9133f8e537afbd87d5f99ef273f6db01342a491b4ae6b34\": container with ID starting with 2af521ddedd12865a9133f8e537afbd87d5f99ef273f6db01342a491b4ae6b34 not found: ID does not exist" Apr 24 17:35:26.518274 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.518197 2573 scope.go:117] "RemoveContainer" containerID="36e62cc3add3cb4ae6c5bf0371599103d353e37f07a6468a783077c190b6259e" Apr 24 17:35:26.518975 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:35:26.518950 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36e62cc3add3cb4ae6c5bf0371599103d353e37f07a6468a783077c190b6259e\": container with ID starting with 36e62cc3add3cb4ae6c5bf0371599103d353e37f07a6468a783077c190b6259e not found: ID does not exist" containerID="36e62cc3add3cb4ae6c5bf0371599103d353e37f07a6468a783077c190b6259e" Apr 24 17:35:26.519060 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.518982 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36e62cc3add3cb4ae6c5bf0371599103d353e37f07a6468a783077c190b6259e"} err="failed to get container status \"36e62cc3add3cb4ae6c5bf0371599103d353e37f07a6468a783077c190b6259e\": rpc error: code = NotFound desc = could not find container \"36e62cc3add3cb4ae6c5bf0371599103d353e37f07a6468a783077c190b6259e\": container with ID starting with 36e62cc3add3cb4ae6c5bf0371599103d353e37f07a6468a783077c190b6259e not found: ID does not exist" Apr 24 17:35:26.519575 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.519553 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"] Apr 24 17:35:26.524186 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:26.524162 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-sz9kp"] Apr 24 17:35:27.494946 ip-10-0-142-182 
kubenswrapper[2573]: I0424 17:35:27.494904 2573 generic.go:358] "Generic (PLEG): container finished" podID="a9e3c445-9972-49f9-92af-255429d02bd7" containerID="a9a0cdcad39e1ee95702cc6a4f9b022312ace19f198d634d05774500f05c003a" exitCode=0 Apr 24 17:35:27.495353 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:27.494977 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" event={"ID":"a9e3c445-9972-49f9-92af-255429d02bd7","Type":"ContainerDied","Data":"a9a0cdcad39e1ee95702cc6a4f9b022312ace19f198d634d05774500f05c003a"} Apr 24 17:35:28.217330 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:28.217263 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" path="/var/lib/kubelet/pods/5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c/volumes" Apr 24 17:35:28.499460 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:28.499421 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" event={"ID":"a9e3c445-9972-49f9-92af-255429d02bd7","Type":"ContainerStarted","Data":"da9503e02837ff594afbfbe16ed5abbb15b9d38f1d10294ef733842c4fd78f9d"} Apr 24 17:35:28.499460 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:28.499463 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" event={"ID":"a9e3c445-9972-49f9-92af-255429d02bd7","Type":"ContainerStarted","Data":"df882327350df7175b7edc0d969442d2f533bf91cd437284757ece36e7b1299c"} Apr 24 17:35:28.499889 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:28.499678 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" Apr 24 17:35:28.525437 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:28.524679 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" podStartSLOduration=6.524652579 podStartE2EDuration="6.524652579s" podCreationTimestamp="2026-04-24 17:35:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:35:28.52122892 +0000 UTC m=+3386.785944319" watchObservedRunningTime="2026-04-24 17:35:28.524652579 +0000 UTC m=+3386.789367978" Apr 24 17:35:29.503055 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:29.503016 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" Apr 24 17:35:35.513162 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:35:35.513134 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" Apr 24 17:36:05.528630 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:05.528538 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" Apr 24 17:36:12.288344 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.288283 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f"] Apr 24 17:36:12.288752 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.288663 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" podUID="a9e3c445-9972-49f9-92af-255429d02bd7" containerName="kserve-container" containerID="cri-o://df882327350df7175b7edc0d969442d2f533bf91cd437284757ece36e7b1299c" gracePeriod=30 Apr 24 17:36:12.288752 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.288691 2573 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" podUID="a9e3c445-9972-49f9-92af-255429d02bd7" containerName="kube-rbac-proxy" containerID="cri-o://da9503e02837ff594afbfbe16ed5abbb15b9d38f1d10294ef733842c4fd78f9d" gracePeriod=30 Apr 24 17:36:12.386181 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.386137 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb"] Apr 24 17:36:12.386496 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.386483 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" containerName="kserve-container" Apr 24 17:36:12.386554 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.386499 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" containerName="kserve-container" Apr 24 17:36:12.386554 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.386508 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" containerName="kube-rbac-proxy" Apr 24 17:36:12.386554 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.386520 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" containerName="kube-rbac-proxy" Apr 24 17:36:12.386554 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.386529 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" containerName="storage-initializer" Apr 24 17:36:12.386554 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.386536 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" containerName="storage-initializer" Apr 24 17:36:12.386720 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.386589 2573 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" containerName="kserve-container" Apr 24 17:36:12.386720 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.386603 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e0a18a3-fbc1-4e41-bb69-8e1781bb9c6c" containerName="kube-rbac-proxy" Apr 24 17:36:12.391038 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.391012 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" Apr 24 17:36:12.393365 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.393338 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-predictor-serving-cert\"" Apr 24 17:36:12.393515 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.393366 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 24 17:36:12.400554 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.400525 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb"] Apr 24 17:36:12.507403 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.507367 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68pf5\" (UniqueName: \"kubernetes.io/projected/1c4465ad-cd31-4e3e-befc-78c335d48df6-kube-api-access-68pf5\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-w64lb\" (UID: \"1c4465ad-cd31-4e3e-befc-78c335d48df6\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" Apr 24 17:36:12.507403 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.507409 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/1c4465ad-cd31-4e3e-befc-78c335d48df6-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-w64lb\" (UID: \"1c4465ad-cd31-4e3e-befc-78c335d48df6\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" Apr 24 17:36:12.507649 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.507440 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1c4465ad-cd31-4e3e-befc-78c335d48df6-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-w64lb\" (UID: \"1c4465ad-cd31-4e3e-befc-78c335d48df6\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" Apr 24 17:36:12.507649 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.507543 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1c4465ad-cd31-4e3e-befc-78c335d48df6-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-w64lb\" (UID: \"1c4465ad-cd31-4e3e-befc-78c335d48df6\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" Apr 24 17:36:12.608330 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.608219 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1c4465ad-cd31-4e3e-befc-78c335d48df6-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-w64lb\" (UID: \"1c4465ad-cd31-4e3e-befc-78c335d48df6\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" Apr 24 17:36:12.608330 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.608292 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68pf5\" (UniqueName: 
\"kubernetes.io/projected/1c4465ad-cd31-4e3e-befc-78c335d48df6-kube-api-access-68pf5\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-w64lb\" (UID: \"1c4465ad-cd31-4e3e-befc-78c335d48df6\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" Apr 24 17:36:12.608330 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.608331 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1c4465ad-cd31-4e3e-befc-78c335d48df6-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-w64lb\" (UID: \"1c4465ad-cd31-4e3e-befc-78c335d48df6\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" Apr 24 17:36:12.608668 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.608356 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1c4465ad-cd31-4e3e-befc-78c335d48df6-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-w64lb\" (UID: \"1c4465ad-cd31-4e3e-befc-78c335d48df6\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" Apr 24 17:36:12.608739 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.608713 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1c4465ad-cd31-4e3e-befc-78c335d48df6-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-w64lb\" (UID: \"1c4465ad-cd31-4e3e-befc-78c335d48df6\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" Apr 24 17:36:12.609037 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.609019 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/1c4465ad-cd31-4e3e-befc-78c335d48df6-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-w64lb\" (UID: \"1c4465ad-cd31-4e3e-befc-78c335d48df6\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" Apr 24 17:36:12.611006 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.610984 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1c4465ad-cd31-4e3e-befc-78c335d48df6-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-w64lb\" (UID: \"1c4465ad-cd31-4e3e-befc-78c335d48df6\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" Apr 24 17:36:12.616407 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.616383 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68pf5\" (UniqueName: \"kubernetes.io/projected/1c4465ad-cd31-4e3e-befc-78c335d48df6-kube-api-access-68pf5\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-w64lb\" (UID: \"1c4465ad-cd31-4e3e-befc-78c335d48df6\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" Apr 24 17:36:12.634339 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.634290 2573 generic.go:358] "Generic (PLEG): container finished" podID="a9e3c445-9972-49f9-92af-255429d02bd7" containerID="da9503e02837ff594afbfbe16ed5abbb15b9d38f1d10294ef733842c4fd78f9d" exitCode=2 Apr 24 17:36:12.634503 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.634366 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" event={"ID":"a9e3c445-9972-49f9-92af-255429d02bd7","Type":"ContainerDied","Data":"da9503e02837ff594afbfbe16ed5abbb15b9d38f1d10294ef733842c4fd78f9d"} Apr 24 17:36:12.702293 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.702255 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" Apr 24 17:36:12.830223 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:12.830196 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb"] Apr 24 17:36:12.832846 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:36:12.832805 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c4465ad_cd31_4e3e_befc_78c335d48df6.slice/crio-ad464a48dbe7945f508d88c77e09fb683ec2070aa41eda4873d694ce329cc8de WatchSource:0}: Error finding container ad464a48dbe7945f508d88c77e09fb683ec2070aa41eda4873d694ce329cc8de: Status 404 returned error can't find the container with id ad464a48dbe7945f508d88c77e09fb683ec2070aa41eda4873d694ce329cc8de Apr 24 17:36:13.638967 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:13.638923 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" event={"ID":"1c4465ad-cd31-4e3e-befc-78c335d48df6","Type":"ContainerStarted","Data":"a61e4d889a743461a463779e92b8d3e931ffecce7ca7531999c43d319bd6c40e"} Apr 24 17:36:13.638967 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:13.638965 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" event={"ID":"1c4465ad-cd31-4e3e-befc-78c335d48df6","Type":"ContainerStarted","Data":"ad464a48dbe7945f508d88c77e09fb683ec2070aa41eda4873d694ce329cc8de"} Apr 24 17:36:15.507799 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:15.507758 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" podUID="a9e3c445-9972-49f9-92af-255429d02bd7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.56:8643/healthz\": dial tcp 10.134.0.56:8643: connect: connection 
refused" Apr 24 17:36:15.514041 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:15.514000 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" podUID="a9e3c445-9972-49f9-92af-255429d02bd7" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.56:8080/v2/models/isvc-xgboost-v2-mlserver/ready\": dial tcp 10.134.0.56:8080: connect: connection refused" Apr 24 17:36:16.649517 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:16.649423 2573 generic.go:358] "Generic (PLEG): container finished" podID="1c4465ad-cd31-4e3e-befc-78c335d48df6" containerID="a61e4d889a743461a463779e92b8d3e931ffecce7ca7531999c43d319bd6c40e" exitCode=0 Apr 24 17:36:16.649517 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:16.649501 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" event={"ID":"1c4465ad-cd31-4e3e-befc-78c335d48df6","Type":"ContainerDied","Data":"a61e4d889a743461a463779e92b8d3e931ffecce7ca7531999c43d319bd6c40e"} Apr 24 17:36:17.655965 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:17.655928 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" event={"ID":"1c4465ad-cd31-4e3e-befc-78c335d48df6","Type":"ContainerStarted","Data":"21f46f303fddd42a7d1567c3f473e9e527cd6039b605955e0acdbad525d1b95b"} Apr 24 17:36:17.655965 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:17.655971 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" event={"ID":"1c4465ad-cd31-4e3e-befc-78c335d48df6","Type":"ContainerStarted","Data":"937099d20c3c18eed585c1e7a4c7d1c7361ab806c05a683486e92832bfc37340"} Apr 24 17:36:17.656424 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:17.656173 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" Apr 24 17:36:17.676622 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:17.676569 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" podStartSLOduration=5.676552263 podStartE2EDuration="5.676552263s" podCreationTimestamp="2026-04-24 17:36:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:36:17.67591072 +0000 UTC m=+3435.940626155" watchObservedRunningTime="2026-04-24 17:36:17.676552263 +0000 UTC m=+3435.941267660" Apr 24 17:36:18.659287 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:18.659255 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" Apr 24 17:36:19.431941 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.431911 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" Apr 24 17:36:19.570181 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.570109 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-448gd\" (UniqueName: \"kubernetes.io/projected/a9e3c445-9972-49f9-92af-255429d02bd7-kube-api-access-448gd\") pod \"a9e3c445-9972-49f9-92af-255429d02bd7\" (UID: \"a9e3c445-9972-49f9-92af-255429d02bd7\") " Apr 24 17:36:19.570181 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.570191 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9e3c445-9972-49f9-92af-255429d02bd7-proxy-tls\") pod \"a9e3c445-9972-49f9-92af-255429d02bd7\" (UID: \"a9e3c445-9972-49f9-92af-255429d02bd7\") " Apr 24 17:36:19.570481 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.570241 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9e3c445-9972-49f9-92af-255429d02bd7-kserve-provision-location\") pod \"a9e3c445-9972-49f9-92af-255429d02bd7\" (UID: \"a9e3c445-9972-49f9-92af-255429d02bd7\") " Apr 24 17:36:19.570481 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.570283 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a9e3c445-9972-49f9-92af-255429d02bd7-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"a9e3c445-9972-49f9-92af-255429d02bd7\" (UID: \"a9e3c445-9972-49f9-92af-255429d02bd7\") " Apr 24 17:36:19.570581 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.570551 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9e3c445-9972-49f9-92af-255429d02bd7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"a9e3c445-9972-49f9-92af-255429d02bd7" (UID: "a9e3c445-9972-49f9-92af-255429d02bd7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:36:19.570707 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.570685 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9e3c445-9972-49f9-92af-255429d02bd7-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "a9e3c445-9972-49f9-92af-255429d02bd7" (UID: "a9e3c445-9972-49f9-92af-255429d02bd7"). InnerVolumeSpecName "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:36:19.572398 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.572378 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9e3c445-9972-49f9-92af-255429d02bd7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a9e3c445-9972-49f9-92af-255429d02bd7" (UID: "a9e3c445-9972-49f9-92af-255429d02bd7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:36:19.572467 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.572423 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9e3c445-9972-49f9-92af-255429d02bd7-kube-api-access-448gd" (OuterVolumeSpecName: "kube-api-access-448gd") pod "a9e3c445-9972-49f9-92af-255429d02bd7" (UID: "a9e3c445-9972-49f9-92af-255429d02bd7"). InnerVolumeSpecName "kube-api-access-448gd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:36:19.663534 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.663501 2573 generic.go:358] "Generic (PLEG): container finished" podID="a9e3c445-9972-49f9-92af-255429d02bd7" containerID="df882327350df7175b7edc0d969442d2f533bf91cd437284757ece36e7b1299c" exitCode=0 Apr 24 17:36:19.663981 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.663586 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" Apr 24 17:36:19.663981 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.663585 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" event={"ID":"a9e3c445-9972-49f9-92af-255429d02bd7","Type":"ContainerDied","Data":"df882327350df7175b7edc0d969442d2f533bf91cd437284757ece36e7b1299c"} Apr 24 17:36:19.663981 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.663700 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f" event={"ID":"a9e3c445-9972-49f9-92af-255429d02bd7","Type":"ContainerDied","Data":"faf0db3d2b3baceef1f56180d11afefcdd6017bf1dbaae8d26e76e65a0ee341e"} Apr 24 17:36:19.663981 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.663717 2573 scope.go:117] "RemoveContainer" containerID="da9503e02837ff594afbfbe16ed5abbb15b9d38f1d10294ef733842c4fd78f9d" Apr 24 17:36:19.671008 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.670979 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9e3c445-9972-49f9-92af-255429d02bd7-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:36:19.671134 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.671011 2573 reconciler_common.go:299] "Volume detached for volume 
\"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a9e3c445-9972-49f9-92af-255429d02bd7-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:36:19.671134 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.671028 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-448gd\" (UniqueName: \"kubernetes.io/projected/a9e3c445-9972-49f9-92af-255429d02bd7-kube-api-access-448gd\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:36:19.671134 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.671042 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9e3c445-9972-49f9-92af-255429d02bd7-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:36:19.671936 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.671913 2573 scope.go:117] "RemoveContainer" containerID="df882327350df7175b7edc0d969442d2f533bf91cd437284757ece36e7b1299c" Apr 24 17:36:19.679891 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.679862 2573 scope.go:117] "RemoveContainer" containerID="a9a0cdcad39e1ee95702cc6a4f9b022312ace19f198d634d05774500f05c003a" Apr 24 17:36:19.684681 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.684657 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f"] Apr 24 17:36:19.688767 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.688740 2573 scope.go:117] "RemoveContainer" containerID="da9503e02837ff594afbfbe16ed5abbb15b9d38f1d10294ef733842c4fd78f9d" Apr 24 17:36:19.689029 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.689009 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-ftm9f"] Apr 24 17:36:19.689095 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:36:19.689070 2573 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da9503e02837ff594afbfbe16ed5abbb15b9d38f1d10294ef733842c4fd78f9d\": container with ID starting with da9503e02837ff594afbfbe16ed5abbb15b9d38f1d10294ef733842c4fd78f9d not found: ID does not exist" containerID="da9503e02837ff594afbfbe16ed5abbb15b9d38f1d10294ef733842c4fd78f9d" Apr 24 17:36:19.689134 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.689095 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da9503e02837ff594afbfbe16ed5abbb15b9d38f1d10294ef733842c4fd78f9d"} err="failed to get container status \"da9503e02837ff594afbfbe16ed5abbb15b9d38f1d10294ef733842c4fd78f9d\": rpc error: code = NotFound desc = could not find container \"da9503e02837ff594afbfbe16ed5abbb15b9d38f1d10294ef733842c4fd78f9d\": container with ID starting with da9503e02837ff594afbfbe16ed5abbb15b9d38f1d10294ef733842c4fd78f9d not found: ID does not exist" Apr 24 17:36:19.689134 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.689115 2573 scope.go:117] "RemoveContainer" containerID="df882327350df7175b7edc0d969442d2f533bf91cd437284757ece36e7b1299c" Apr 24 17:36:19.689384 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:36:19.689363 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df882327350df7175b7edc0d969442d2f533bf91cd437284757ece36e7b1299c\": container with ID starting with df882327350df7175b7edc0d969442d2f533bf91cd437284757ece36e7b1299c not found: ID does not exist" containerID="df882327350df7175b7edc0d969442d2f533bf91cd437284757ece36e7b1299c" Apr 24 17:36:19.689447 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.689391 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df882327350df7175b7edc0d969442d2f533bf91cd437284757ece36e7b1299c"} err="failed to get container status 
\"df882327350df7175b7edc0d969442d2f533bf91cd437284757ece36e7b1299c\": rpc error: code = NotFound desc = could not find container \"df882327350df7175b7edc0d969442d2f533bf91cd437284757ece36e7b1299c\": container with ID starting with df882327350df7175b7edc0d969442d2f533bf91cd437284757ece36e7b1299c not found: ID does not exist" Apr 24 17:36:19.689447 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.689408 2573 scope.go:117] "RemoveContainer" containerID="a9a0cdcad39e1ee95702cc6a4f9b022312ace19f198d634d05774500f05c003a" Apr 24 17:36:19.689626 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:36:19.689611 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9a0cdcad39e1ee95702cc6a4f9b022312ace19f198d634d05774500f05c003a\": container with ID starting with a9a0cdcad39e1ee95702cc6a4f9b022312ace19f198d634d05774500f05c003a not found: ID does not exist" containerID="a9a0cdcad39e1ee95702cc6a4f9b022312ace19f198d634d05774500f05c003a" Apr 24 17:36:19.689671 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:19.689631 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9a0cdcad39e1ee95702cc6a4f9b022312ace19f198d634d05774500f05c003a"} err="failed to get container status \"a9a0cdcad39e1ee95702cc6a4f9b022312ace19f198d634d05774500f05c003a\": rpc error: code = NotFound desc = could not find container \"a9a0cdcad39e1ee95702cc6a4f9b022312ace19f198d634d05774500f05c003a\": container with ID starting with a9a0cdcad39e1ee95702cc6a4f9b022312ace19f198d634d05774500f05c003a not found: ID does not exist" Apr 24 17:36:20.217488 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:20.217450 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9e3c445-9972-49f9-92af-255429d02bd7" path="/var/lib/kubelet/pods/a9e3c445-9972-49f9-92af-255429d02bd7/volumes" Apr 24 17:36:24.669451 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:24.669418 2573 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" Apr 24 17:36:54.673536 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:36:54.673505 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" Apr 24 17:37:02.467501 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.467466 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb"] Apr 24 17:37:02.467904 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.467814 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" podUID="1c4465ad-cd31-4e3e-befc-78c335d48df6" containerName="kserve-container" containerID="cri-o://937099d20c3c18eed585c1e7a4c7d1c7361ab806c05a683486e92832bfc37340" gracePeriod=30 Apr 24 17:37:02.467981 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.467860 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" podUID="1c4465ad-cd31-4e3e-befc-78c335d48df6" containerName="kube-rbac-proxy" containerID="cri-o://21f46f303fddd42a7d1567c3f473e9e527cd6039b605955e0acdbad525d1b95b" gracePeriod=30 Apr 24 17:37:02.562122 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.562075 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd"] Apr 24 17:37:02.562424 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.562411 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9e3c445-9972-49f9-92af-255429d02bd7" containerName="storage-initializer" Apr 24 17:37:02.562471 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.562426 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a9e3c445-9972-49f9-92af-255429d02bd7" containerName="storage-initializer" Apr 24 17:37:02.562471 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.562445 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9e3c445-9972-49f9-92af-255429d02bd7" containerName="kube-rbac-proxy" Apr 24 17:37:02.562471 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.562451 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e3c445-9972-49f9-92af-255429d02bd7" containerName="kube-rbac-proxy" Apr 24 17:37:02.562471 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.562458 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9e3c445-9972-49f9-92af-255429d02bd7" containerName="kserve-container" Apr 24 17:37:02.562471 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.562464 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e3c445-9972-49f9-92af-255429d02bd7" containerName="kserve-container" Apr 24 17:37:02.562623 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.562515 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a9e3c445-9972-49f9-92af-255429d02bd7" containerName="kserve-container" Apr 24 17:37:02.562623 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.562523 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a9e3c445-9972-49f9-92af-255429d02bd7" containerName="kube-rbac-proxy" Apr 24 17:37:02.565769 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.565747 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" Apr 24 17:37:02.567931 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.567911 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\"" Apr 24 17:37:02.568236 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.568222 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-predictor-serving-cert\"" Apr 24 17:37:02.576904 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.576874 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd"] Apr 24 17:37:02.666860 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.666825 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cft4w\" (UniqueName: \"kubernetes.io/projected/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-kube-api-access-cft4w\") pod \"isvc-xgboost-runtime-predictor-779db84d9-l2lsd\" (UID: \"8f9be04b-4cf3-405b-a269-cb0c52abbb2a\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" Apr 24 17:37:02.667044 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.666875 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-l2lsd\" (UID: \"8f9be04b-4cf3-405b-a269-cb0c52abbb2a\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" Apr 24 17:37:02.667044 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.666971 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-l2lsd\" (UID: \"8f9be04b-4cf3-405b-a269-cb0c52abbb2a\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" Apr 24 17:37:02.667044 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.667012 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-l2lsd\" (UID: \"8f9be04b-4cf3-405b-a269-cb0c52abbb2a\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" Apr 24 17:37:02.768258 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.768219 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-l2lsd\" (UID: \"8f9be04b-4cf3-405b-a269-cb0c52abbb2a\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" Apr 24 17:37:02.768469 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.768267 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-l2lsd\" (UID: \"8f9be04b-4cf3-405b-a269-cb0c52abbb2a\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" Apr 24 17:37:02.768469 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.768297 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cft4w\" (UniqueName: 
\"kubernetes.io/projected/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-kube-api-access-cft4w\") pod \"isvc-xgboost-runtime-predictor-779db84d9-l2lsd\" (UID: \"8f9be04b-4cf3-405b-a269-cb0c52abbb2a\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" Apr 24 17:37:02.768469 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.768354 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-l2lsd\" (UID: \"8f9be04b-4cf3-405b-a269-cb0c52abbb2a\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" Apr 24 17:37:02.768782 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.768758 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-l2lsd\" (UID: \"8f9be04b-4cf3-405b-a269-cb0c52abbb2a\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" Apr 24 17:37:02.768994 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.768971 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-l2lsd\" (UID: \"8f9be04b-4cf3-405b-a269-cb0c52abbb2a\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" Apr 24 17:37:02.770953 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.770936 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-l2lsd\" (UID: 
\"8f9be04b-4cf3-405b-a269-cb0c52abbb2a\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" Apr 24 17:37:02.776356 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.776329 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cft4w\" (UniqueName: \"kubernetes.io/projected/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-kube-api-access-cft4w\") pod \"isvc-xgboost-runtime-predictor-779db84d9-l2lsd\" (UID: \"8f9be04b-4cf3-405b-a269-cb0c52abbb2a\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" Apr 24 17:37:02.789132 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.789103 2573 generic.go:358] "Generic (PLEG): container finished" podID="1c4465ad-cd31-4e3e-befc-78c335d48df6" containerID="21f46f303fddd42a7d1567c3f473e9e527cd6039b605955e0acdbad525d1b95b" exitCode=2 Apr 24 17:37:02.789274 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.789176 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" event={"ID":"1c4465ad-cd31-4e3e-befc-78c335d48df6","Type":"ContainerDied","Data":"21f46f303fddd42a7d1567c3f473e9e527cd6039b605955e0acdbad525d1b95b"} Apr 24 17:37:02.876913 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:02.876870 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" Apr 24 17:37:03.000061 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:03.000034 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd"] Apr 24 17:37:03.002495 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:37:03.002454 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f9be04b_4cf3_405b_a269_cb0c52abbb2a.slice/crio-f91a8383f4a650ceb80c159cb162b06229be49ad97316985e865eae6fdac00bc WatchSource:0}: Error finding container f91a8383f4a650ceb80c159cb162b06229be49ad97316985e865eae6fdac00bc: Status 404 returned error can't find the container with id f91a8383f4a650ceb80c159cb162b06229be49ad97316985e865eae6fdac00bc Apr 24 17:37:03.792876 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:03.792835 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" event={"ID":"8f9be04b-4cf3-405b-a269-cb0c52abbb2a","Type":"ContainerStarted","Data":"01af4d7eed4fb53ef392bde42cfa32f682eec3c595043ab7019197be0345943d"} Apr 24 17:37:03.792876 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:03.792875 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" event={"ID":"8f9be04b-4cf3-405b-a269-cb0c52abbb2a","Type":"ContainerStarted","Data":"f91a8383f4a650ceb80c159cb162b06229be49ad97316985e865eae6fdac00bc"} Apr 24 17:37:04.665322 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:04.665272 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" podUID="1c4465ad-cd31-4e3e-befc-78c335d48df6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.57:8643/healthz\": dial tcp 10.134.0.57:8643: connect: connection 
refused" Apr 24 17:37:06.802739 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:06.802702 2573 generic.go:358] "Generic (PLEG): container finished" podID="8f9be04b-4cf3-405b-a269-cb0c52abbb2a" containerID="01af4d7eed4fb53ef392bde42cfa32f682eec3c595043ab7019197be0345943d" exitCode=0 Apr 24 17:37:06.803139 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:06.802774 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" event={"ID":"8f9be04b-4cf3-405b-a269-cb0c52abbb2a","Type":"ContainerDied","Data":"01af4d7eed4fb53ef392bde42cfa32f682eec3c595043ab7019197be0345943d"} Apr 24 17:37:07.807671 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:07.807636 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" event={"ID":"8f9be04b-4cf3-405b-a269-cb0c52abbb2a","Type":"ContainerStarted","Data":"24d22f19f919a2dbae1902f0e3142e20f88a49a9760eb83ddc5b186430cde9d3"} Apr 24 17:37:07.807671 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:07.807676 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" event={"ID":"8f9be04b-4cf3-405b-a269-cb0c52abbb2a","Type":"ContainerStarted","Data":"0bc283e22a0bc4d0fcc051148af9434bdec35155605310e3dfc70a1fbbb806c8"} Apr 24 17:37:07.808090 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:07.807913 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" Apr 24 17:37:07.830126 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:07.830057 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" podStartSLOduration=5.830039483 podStartE2EDuration="5.830039483s" podCreationTimestamp="2026-04-24 17:37:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:37:07.82757985 +0000 UTC m=+3486.092295248" watchObservedRunningTime="2026-04-24 17:37:07.830039483 +0000 UTC m=+3486.094754880" Apr 24 17:37:08.811347 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:08.811293 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" Apr 24 17:37:08.812453 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:08.812425 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" podUID="8f9be04b-4cf3-405b-a269-cb0c52abbb2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 24 17:37:09.505948 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.505924 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" Apr 24 17:37:09.525224 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.524936 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1c4465ad-cd31-4e3e-befc-78c335d48df6-proxy-tls\") pod \"1c4465ad-cd31-4e3e-befc-78c335d48df6\" (UID: \"1c4465ad-cd31-4e3e-befc-78c335d48df6\") " Apr 24 17:37:09.525224 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.525003 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68pf5\" (UniqueName: \"kubernetes.io/projected/1c4465ad-cd31-4e3e-befc-78c335d48df6-kube-api-access-68pf5\") pod \"1c4465ad-cd31-4e3e-befc-78c335d48df6\" (UID: \"1c4465ad-cd31-4e3e-befc-78c335d48df6\") " Apr 24 17:37:09.525224 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.525042 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1c4465ad-cd31-4e3e-befc-78c335d48df6-kserve-provision-location\") pod \"1c4465ad-cd31-4e3e-befc-78c335d48df6\" (UID: \"1c4465ad-cd31-4e3e-befc-78c335d48df6\") " Apr 24 17:37:09.525224 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.525113 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1c4465ad-cd31-4e3e-befc-78c335d48df6-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"1c4465ad-cd31-4e3e-befc-78c335d48df6\" (UID: \"1c4465ad-cd31-4e3e-befc-78c335d48df6\") " Apr 24 17:37:09.525635 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.525608 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c4465ad-cd31-4e3e-befc-78c335d48df6-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "1c4465ad-cd31-4e3e-befc-78c335d48df6" (UID: "1c4465ad-cd31-4e3e-befc-78c335d48df6"). InnerVolumeSpecName "xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:37:09.526759 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.526697 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c4465ad-cd31-4e3e-befc-78c335d48df6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1c4465ad-cd31-4e3e-befc-78c335d48df6" (UID: "1c4465ad-cd31-4e3e-befc-78c335d48df6"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:37:09.528008 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.527981 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c4465ad-cd31-4e3e-befc-78c335d48df6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1c4465ad-cd31-4e3e-befc-78c335d48df6" (UID: "1c4465ad-cd31-4e3e-befc-78c335d48df6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:37:09.528388 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.528364 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c4465ad-cd31-4e3e-befc-78c335d48df6-kube-api-access-68pf5" (OuterVolumeSpecName: "kube-api-access-68pf5") pod "1c4465ad-cd31-4e3e-befc-78c335d48df6" (UID: "1c4465ad-cd31-4e3e-befc-78c335d48df6"). InnerVolumeSpecName "kube-api-access-68pf5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:37:09.626011 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.625969 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-68pf5\" (UniqueName: \"kubernetes.io/projected/1c4465ad-cd31-4e3e-befc-78c335d48df6-kube-api-access-68pf5\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:37:09.626011 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.626008 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1c4465ad-cd31-4e3e-befc-78c335d48df6-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:37:09.626011 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.626021 2573 reconciler_common.go:299] "Volume detached for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1c4465ad-cd31-4e3e-befc-78c335d48df6-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" 
DevicePath \"\"" Apr 24 17:37:09.626269 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.626032 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1c4465ad-cd31-4e3e-befc-78c335d48df6-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:37:09.815227 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.815187 2573 generic.go:358] "Generic (PLEG): container finished" podID="1c4465ad-cd31-4e3e-befc-78c335d48df6" containerID="937099d20c3c18eed585c1e7a4c7d1c7361ab806c05a683486e92832bfc37340" exitCode=0 Apr 24 17:37:09.815796 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.815228 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" event={"ID":"1c4465ad-cd31-4e3e-befc-78c335d48df6","Type":"ContainerDied","Data":"937099d20c3c18eed585c1e7a4c7d1c7361ab806c05a683486e92832bfc37340"} Apr 24 17:37:09.815796 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.815277 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" event={"ID":"1c4465ad-cd31-4e3e-befc-78c335d48df6","Type":"ContainerDied","Data":"ad464a48dbe7945f508d88c77e09fb683ec2070aa41eda4873d694ce329cc8de"} Apr 24 17:37:09.815796 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.815295 2573 scope.go:117] "RemoveContainer" containerID="21f46f303fddd42a7d1567c3f473e9e527cd6039b605955e0acdbad525d1b95b" Apr 24 17:37:09.815796 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.815301 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb" Apr 24 17:37:09.815796 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.815645 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" podUID="8f9be04b-4cf3-405b-a269-cb0c52abbb2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 24 17:37:09.823418 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.823395 2573 scope.go:117] "RemoveContainer" containerID="937099d20c3c18eed585c1e7a4c7d1c7361ab806c05a683486e92832bfc37340" Apr 24 17:37:09.831024 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.831004 2573 scope.go:117] "RemoveContainer" containerID="a61e4d889a743461a463779e92b8d3e931ffecce7ca7531999c43d319bd6c40e" Apr 24 17:37:09.836192 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.836161 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb"] Apr 24 17:37:09.838836 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.838812 2573 scope.go:117] "RemoveContainer" containerID="21f46f303fddd42a7d1567c3f473e9e527cd6039b605955e0acdbad525d1b95b" Apr 24 17:37:09.839189 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:37:09.839162 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21f46f303fddd42a7d1567c3f473e9e527cd6039b605955e0acdbad525d1b95b\": container with ID starting with 21f46f303fddd42a7d1567c3f473e9e527cd6039b605955e0acdbad525d1b95b not found: ID does not exist" containerID="21f46f303fddd42a7d1567c3f473e9e527cd6039b605955e0acdbad525d1b95b" Apr 24 17:37:09.839275 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.839200 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"21f46f303fddd42a7d1567c3f473e9e527cd6039b605955e0acdbad525d1b95b"} err="failed to get container status \"21f46f303fddd42a7d1567c3f473e9e527cd6039b605955e0acdbad525d1b95b\": rpc error: code = NotFound desc = could not find container \"21f46f303fddd42a7d1567c3f473e9e527cd6039b605955e0acdbad525d1b95b\": container with ID starting with 21f46f303fddd42a7d1567c3f473e9e527cd6039b605955e0acdbad525d1b95b not found: ID does not exist" Apr 24 17:37:09.839275 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.839225 2573 scope.go:117] "RemoveContainer" containerID="937099d20c3c18eed585c1e7a4c7d1c7361ab806c05a683486e92832bfc37340" Apr 24 17:37:09.839519 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:37:09.839495 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"937099d20c3c18eed585c1e7a4c7d1c7361ab806c05a683486e92832bfc37340\": container with ID starting with 937099d20c3c18eed585c1e7a4c7d1c7361ab806c05a683486e92832bfc37340 not found: ID does not exist" containerID="937099d20c3c18eed585c1e7a4c7d1c7361ab806c05a683486e92832bfc37340" Apr 24 17:37:09.839583 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.839525 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"937099d20c3c18eed585c1e7a4c7d1c7361ab806c05a683486e92832bfc37340"} err="failed to get container status \"937099d20c3c18eed585c1e7a4c7d1c7361ab806c05a683486e92832bfc37340\": rpc error: code = NotFound desc = could not find container \"937099d20c3c18eed585c1e7a4c7d1c7361ab806c05a683486e92832bfc37340\": container with ID starting with 937099d20c3c18eed585c1e7a4c7d1c7361ab806c05a683486e92832bfc37340 not found: ID does not exist" Apr 24 17:37:09.839583 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.839540 2573 scope.go:117] "RemoveContainer" containerID="a61e4d889a743461a463779e92b8d3e931ffecce7ca7531999c43d319bd6c40e" Apr 24 17:37:09.839781 ip-10-0-142-182 
kubenswrapper[2573]: E0424 17:37:09.839765 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a61e4d889a743461a463779e92b8d3e931ffecce7ca7531999c43d319bd6c40e\": container with ID starting with a61e4d889a743461a463779e92b8d3e931ffecce7ca7531999c43d319bd6c40e not found: ID does not exist" containerID="a61e4d889a743461a463779e92b8d3e931ffecce7ca7531999c43d319bd6c40e" Apr 24 17:37:09.839836 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.839783 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a61e4d889a743461a463779e92b8d3e931ffecce7ca7531999c43d319bd6c40e"} err="failed to get container status \"a61e4d889a743461a463779e92b8d3e931ffecce7ca7531999c43d319bd6c40e\": rpc error: code = NotFound desc = could not find container \"a61e4d889a743461a463779e92b8d3e931ffecce7ca7531999c43d319bd6c40e\": container with ID starting with a61e4d889a743461a463779e92b8d3e931ffecce7ca7531999c43d319bd6c40e not found: ID does not exist" Apr 24 17:37:09.840241 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:09.840219 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-w64lb"] Apr 24 17:37:10.216978 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:10.216899 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c4465ad-cd31-4e3e-befc-78c335d48df6" path="/var/lib/kubelet/pods/1c4465ad-cd31-4e3e-befc-78c335d48df6/volumes" Apr 24 17:37:14.821353 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:14.821294 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" Apr 24 17:37:14.821850 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:14.821821 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" 
podUID="8f9be04b-4cf3-405b-a269-cb0c52abbb2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 24 17:37:24.822428 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:24.822335 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" podUID="8f9be04b-4cf3-405b-a269-cb0c52abbb2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 24 17:37:34.822433 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:34.822388 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" podUID="8f9be04b-4cf3-405b-a269-cb0c52abbb2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 24 17:37:44.822193 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:44.822148 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" podUID="8f9be04b-4cf3-405b-a269-cb0c52abbb2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 24 17:37:54.822037 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:37:54.821996 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" podUID="8f9be04b-4cf3-405b-a269-cb0c52abbb2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 24 17:38:04.822475 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:04.822438 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" Apr 24 17:38:12.678660 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.678624 2573 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd"] Apr 24 17:38:12.679125 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.679020 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" podUID="8f9be04b-4cf3-405b-a269-cb0c52abbb2a" containerName="kserve-container" containerID="cri-o://0bc283e22a0bc4d0fcc051148af9434bdec35155605310e3dfc70a1fbbb806c8" gracePeriod=30 Apr 24 17:38:12.679203 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.679171 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" podUID="8f9be04b-4cf3-405b-a269-cb0c52abbb2a" containerName="kube-rbac-proxy" containerID="cri-o://24d22f19f919a2dbae1902f0e3142e20f88a49a9760eb83ddc5b186430cde9d3" gracePeriod=30 Apr 24 17:38:12.753042 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.753007 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89"] Apr 24 17:38:12.753341 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.753324 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c4465ad-cd31-4e3e-befc-78c335d48df6" containerName="kserve-container" Apr 24 17:38:12.753440 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.753344 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c4465ad-cd31-4e3e-befc-78c335d48df6" containerName="kserve-container" Apr 24 17:38:12.753440 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.753362 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c4465ad-cd31-4e3e-befc-78c335d48df6" containerName="kube-rbac-proxy" Apr 24 17:38:12.753440 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.753368 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1c4465ad-cd31-4e3e-befc-78c335d48df6" containerName="kube-rbac-proxy" Apr 24 17:38:12.753440 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.753382 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c4465ad-cd31-4e3e-befc-78c335d48df6" containerName="storage-initializer" Apr 24 17:38:12.753440 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.753388 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c4465ad-cd31-4e3e-befc-78c335d48df6" containerName="storage-initializer" Apr 24 17:38:12.753440 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.753436 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c4465ad-cd31-4e3e-befc-78c335d48df6" containerName="kube-rbac-proxy" Apr 24 17:38:12.753660 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.753446 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c4465ad-cd31-4e3e-befc-78c335d48df6" containerName="kserve-container" Apr 24 17:38:12.757912 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.757887 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" Apr 24 17:38:12.760142 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.760115 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 24 17:38:12.760257 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.760125 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-predictor-serving-cert\"" Apr 24 17:38:12.767814 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.767784 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89"] Apr 24 17:38:12.843068 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.843031 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2343a3b7-85bf-45ae-9376-d4d7b23c3017-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89\" (UID: \"2343a3b7-85bf-45ae-9376-d4d7b23c3017\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" Apr 24 17:38:12.843247 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.843083 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2343a3b7-85bf-45ae-9376-d4d7b23c3017-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89\" (UID: \"2343a3b7-85bf-45ae-9376-d4d7b23c3017\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" Apr 24 17:38:12.843247 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.843199 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4g2v\" (UniqueName: \"kubernetes.io/projected/2343a3b7-85bf-45ae-9376-d4d7b23c3017-kube-api-access-x4g2v\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89\" (UID: \"2343a3b7-85bf-45ae-9376-d4d7b23c3017\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" Apr 24 17:38:12.843247 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.843238 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2343a3b7-85bf-45ae-9376-d4d7b23c3017-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89\" (UID: \"2343a3b7-85bf-45ae-9376-d4d7b23c3017\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" Apr 24 17:38:12.944323 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.944212 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2343a3b7-85bf-45ae-9376-d4d7b23c3017-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89\" (UID: \"2343a3b7-85bf-45ae-9376-d4d7b23c3017\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" Apr 24 17:38:12.944323 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.944268 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2343a3b7-85bf-45ae-9376-d4d7b23c3017-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89\" (UID: \"2343a3b7-85bf-45ae-9376-d4d7b23c3017\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" Apr 24 17:38:12.944546 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.944334 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-x4g2v\" (UniqueName: \"kubernetes.io/projected/2343a3b7-85bf-45ae-9376-d4d7b23c3017-kube-api-access-x4g2v\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89\" (UID: \"2343a3b7-85bf-45ae-9376-d4d7b23c3017\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" Apr 24 17:38:12.944546 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.944358 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2343a3b7-85bf-45ae-9376-d4d7b23c3017-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89\" (UID: \"2343a3b7-85bf-45ae-9376-d4d7b23c3017\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" Apr 24 17:38:12.944667 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.944646 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2343a3b7-85bf-45ae-9376-d4d7b23c3017-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89\" (UID: \"2343a3b7-85bf-45ae-9376-d4d7b23c3017\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" Apr 24 17:38:12.944942 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.944924 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2343a3b7-85bf-45ae-9376-d4d7b23c3017-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89\" (UID: \"2343a3b7-85bf-45ae-9376-d4d7b23c3017\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" Apr 24 17:38:12.946934 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.946915 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/2343a3b7-85bf-45ae-9376-d4d7b23c3017-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89\" (UID: \"2343a3b7-85bf-45ae-9376-d4d7b23c3017\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" Apr 24 17:38:12.952439 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.952410 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4g2v\" (UniqueName: \"kubernetes.io/projected/2343a3b7-85bf-45ae-9376-d4d7b23c3017-kube-api-access-x4g2v\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89\" (UID: \"2343a3b7-85bf-45ae-9376-d4d7b23c3017\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" Apr 24 17:38:12.999415 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.999381 2573 generic.go:358] "Generic (PLEG): container finished" podID="8f9be04b-4cf3-405b-a269-cb0c52abbb2a" containerID="24d22f19f919a2dbae1902f0e3142e20f88a49a9760eb83ddc5b186430cde9d3" exitCode=2 Apr 24 17:38:12.999585 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:12.999422 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" event={"ID":"8f9be04b-4cf3-405b-a269-cb0c52abbb2a","Type":"ContainerDied","Data":"24d22f19f919a2dbae1902f0e3142e20f88a49a9760eb83ddc5b186430cde9d3"} Apr 24 17:38:13.067696 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:13.067653 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" Apr 24 17:38:13.204782 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:13.204733 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89"] Apr 24 17:38:13.205014 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:38:13.204988 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2343a3b7_85bf_45ae_9376_d4d7b23c3017.slice/crio-a254a9687882a49fac79850064bb1890c48bb107ac7abf5a427351964f69c817 WatchSource:0}: Error finding container a254a9687882a49fac79850064bb1890c48bb107ac7abf5a427351964f69c817: Status 404 returned error can't find the container with id a254a9687882a49fac79850064bb1890c48bb107ac7abf5a427351964f69c817 Apr 24 17:38:14.003186 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:14.003149 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" event={"ID":"2343a3b7-85bf-45ae-9376-d4d7b23c3017","Type":"ContainerStarted","Data":"b8a4d6ea318af85ec74e22f856454261a29ba8c56f58b8413b62000b349c1ee2"} Apr 24 17:38:14.003186 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:14.003189 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" event={"ID":"2343a3b7-85bf-45ae-9376-d4d7b23c3017","Type":"ContainerStarted","Data":"a254a9687882a49fac79850064bb1890c48bb107ac7abf5a427351964f69c817"} Apr 24 17:38:14.816767 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:14.816716 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" podUID="8f9be04b-4cf3-405b-a269-cb0c52abbb2a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.58:8643/healthz\": dial tcp 10.134.0.58:8643: connect: 
connection refused" Apr 24 17:38:14.821885 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:14.821854 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" podUID="8f9be04b-4cf3-405b-a269-cb0c52abbb2a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 24 17:38:16.516201 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:16.516175 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" Apr 24 17:38:16.578598 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:16.578504 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-kserve-provision-location\") pod \"8f9be04b-4cf3-405b-a269-cb0c52abbb2a\" (UID: \"8f9be04b-4cf3-405b-a269-cb0c52abbb2a\") " Apr 24 17:38:16.578598 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:16.578593 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cft4w\" (UniqueName: \"kubernetes.io/projected/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-kube-api-access-cft4w\") pod \"8f9be04b-4cf3-405b-a269-cb0c52abbb2a\" (UID: \"8f9be04b-4cf3-405b-a269-cb0c52abbb2a\") " Apr 24 17:38:16.578836 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:16.578637 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-proxy-tls\") pod \"8f9be04b-4cf3-405b-a269-cb0c52abbb2a\" (UID: \"8f9be04b-4cf3-405b-a269-cb0c52abbb2a\") " Apr 24 17:38:16.578836 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:16.578688 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" 
(UniqueName: \"kubernetes.io/configmap/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"8f9be04b-4cf3-405b-a269-cb0c52abbb2a\" (UID: \"8f9be04b-4cf3-405b-a269-cb0c52abbb2a\") " Apr 24 17:38:16.578930 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:16.578883 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8f9be04b-4cf3-405b-a269-cb0c52abbb2a" (UID: "8f9be04b-4cf3-405b-a269-cb0c52abbb2a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:38:16.579148 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:16.579121 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-isvc-xgboost-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-runtime-kube-rbac-proxy-sar-config") pod "8f9be04b-4cf3-405b-a269-cb0c52abbb2a" (UID: "8f9be04b-4cf3-405b-a269-cb0c52abbb2a"). InnerVolumeSpecName "isvc-xgboost-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:38:16.580940 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:16.580913 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8f9be04b-4cf3-405b-a269-cb0c52abbb2a" (UID: "8f9be04b-4cf3-405b-a269-cb0c52abbb2a"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:38:16.581105 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:16.581091 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-kube-api-access-cft4w" (OuterVolumeSpecName: "kube-api-access-cft4w") pod "8f9be04b-4cf3-405b-a269-cb0c52abbb2a" (UID: "8f9be04b-4cf3-405b-a269-cb0c52abbb2a"). InnerVolumeSpecName "kube-api-access-cft4w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:38:16.679677 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:16.679638 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:38:16.679677 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:16.679672 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:38:16.679677 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:16.679682 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:38:16.679920 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:16.679694 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cft4w\" (UniqueName: \"kubernetes.io/projected/8f9be04b-4cf3-405b-a269-cb0c52abbb2a-kube-api-access-cft4w\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:38:17.013280 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:17.013242 2573 generic.go:358] "Generic (PLEG): 
container finished" podID="2343a3b7-85bf-45ae-9376-d4d7b23c3017" containerID="b8a4d6ea318af85ec74e22f856454261a29ba8c56f58b8413b62000b349c1ee2" exitCode=0 Apr 24 17:38:17.013500 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:17.013341 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" event={"ID":"2343a3b7-85bf-45ae-9376-d4d7b23c3017","Type":"ContainerDied","Data":"b8a4d6ea318af85ec74e22f856454261a29ba8c56f58b8413b62000b349c1ee2"} Apr 24 17:38:17.015202 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:17.015172 2573 generic.go:358] "Generic (PLEG): container finished" podID="8f9be04b-4cf3-405b-a269-cb0c52abbb2a" containerID="0bc283e22a0bc4d0fcc051148af9434bdec35155605310e3dfc70a1fbbb806c8" exitCode=0 Apr 24 17:38:17.015348 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:17.015249 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" Apr 24 17:38:17.015348 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:17.015245 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" event={"ID":"8f9be04b-4cf3-405b-a269-cb0c52abbb2a","Type":"ContainerDied","Data":"0bc283e22a0bc4d0fcc051148af9434bdec35155605310e3dfc70a1fbbb806c8"} Apr 24 17:38:17.015472 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:17.015363 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd" event={"ID":"8f9be04b-4cf3-405b-a269-cb0c52abbb2a","Type":"ContainerDied","Data":"f91a8383f4a650ceb80c159cb162b06229be49ad97316985e865eae6fdac00bc"} Apr 24 17:38:17.015472 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:17.015394 2573 scope.go:117] "RemoveContainer" containerID="24d22f19f919a2dbae1902f0e3142e20f88a49a9760eb83ddc5b186430cde9d3" Apr 24 17:38:17.023723 ip-10-0-142-182 
kubenswrapper[2573]: I0424 17:38:17.023690 2573 scope.go:117] "RemoveContainer" containerID="0bc283e22a0bc4d0fcc051148af9434bdec35155605310e3dfc70a1fbbb806c8" Apr 24 17:38:17.030738 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:17.030714 2573 scope.go:117] "RemoveContainer" containerID="01af4d7eed4fb53ef392bde42cfa32f682eec3c595043ab7019197be0345943d" Apr 24 17:38:17.038951 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:17.038924 2573 scope.go:117] "RemoveContainer" containerID="24d22f19f919a2dbae1902f0e3142e20f88a49a9760eb83ddc5b186430cde9d3" Apr 24 17:38:17.039228 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:38:17.039209 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24d22f19f919a2dbae1902f0e3142e20f88a49a9760eb83ddc5b186430cde9d3\": container with ID starting with 24d22f19f919a2dbae1902f0e3142e20f88a49a9760eb83ddc5b186430cde9d3 not found: ID does not exist" containerID="24d22f19f919a2dbae1902f0e3142e20f88a49a9760eb83ddc5b186430cde9d3" Apr 24 17:38:17.039335 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:17.039240 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24d22f19f919a2dbae1902f0e3142e20f88a49a9760eb83ddc5b186430cde9d3"} err="failed to get container status \"24d22f19f919a2dbae1902f0e3142e20f88a49a9760eb83ddc5b186430cde9d3\": rpc error: code = NotFound desc = could not find container \"24d22f19f919a2dbae1902f0e3142e20f88a49a9760eb83ddc5b186430cde9d3\": container with ID starting with 24d22f19f919a2dbae1902f0e3142e20f88a49a9760eb83ddc5b186430cde9d3 not found: ID does not exist" Apr 24 17:38:17.039335 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:17.039265 2573 scope.go:117] "RemoveContainer" containerID="0bc283e22a0bc4d0fcc051148af9434bdec35155605310e3dfc70a1fbbb806c8" Apr 24 17:38:17.039540 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:38:17.039516 2573 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"0bc283e22a0bc4d0fcc051148af9434bdec35155605310e3dfc70a1fbbb806c8\": container with ID starting with 0bc283e22a0bc4d0fcc051148af9434bdec35155605310e3dfc70a1fbbb806c8 not found: ID does not exist" containerID="0bc283e22a0bc4d0fcc051148af9434bdec35155605310e3dfc70a1fbbb806c8" Apr 24 17:38:17.039581 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:17.039547 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bc283e22a0bc4d0fcc051148af9434bdec35155605310e3dfc70a1fbbb806c8"} err="failed to get container status \"0bc283e22a0bc4d0fcc051148af9434bdec35155605310e3dfc70a1fbbb806c8\": rpc error: code = NotFound desc = could not find container \"0bc283e22a0bc4d0fcc051148af9434bdec35155605310e3dfc70a1fbbb806c8\": container with ID starting with 0bc283e22a0bc4d0fcc051148af9434bdec35155605310e3dfc70a1fbbb806c8 not found: ID does not exist" Apr 24 17:38:17.039581 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:17.039561 2573 scope.go:117] "RemoveContainer" containerID="01af4d7eed4fb53ef392bde42cfa32f682eec3c595043ab7019197be0345943d" Apr 24 17:38:17.039790 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:38:17.039771 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01af4d7eed4fb53ef392bde42cfa32f682eec3c595043ab7019197be0345943d\": container with ID starting with 01af4d7eed4fb53ef392bde42cfa32f682eec3c595043ab7019197be0345943d not found: ID does not exist" containerID="01af4d7eed4fb53ef392bde42cfa32f682eec3c595043ab7019197be0345943d" Apr 24 17:38:17.039841 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:17.039797 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01af4d7eed4fb53ef392bde42cfa32f682eec3c595043ab7019197be0345943d"} err="failed to get container status \"01af4d7eed4fb53ef392bde42cfa32f682eec3c595043ab7019197be0345943d\": rpc 
error: code = NotFound desc = could not find container \"01af4d7eed4fb53ef392bde42cfa32f682eec3c595043ab7019197be0345943d\": container with ID starting with 01af4d7eed4fb53ef392bde42cfa32f682eec3c595043ab7019197be0345943d not found: ID does not exist" Apr 24 17:38:17.053020 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:17.052993 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd"] Apr 24 17:38:17.057736 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:17.057712 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-l2lsd"] Apr 24 17:38:18.019900 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:18.019864 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" event={"ID":"2343a3b7-85bf-45ae-9376-d4d7b23c3017","Type":"ContainerStarted","Data":"d431c8b8ec4bd66eaf95fa6e8a415a00fa0b5851ccbee8aed538781b64893aac"} Apr 24 17:38:18.020322 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:18.019912 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" event={"ID":"2343a3b7-85bf-45ae-9376-d4d7b23c3017","Type":"ContainerStarted","Data":"f80505bc54d23dd51be420b512d2bcd0bc30885b116364e3d59d02b70102ec5a"} Apr 24 17:38:18.020322 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:18.020245 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" Apr 24 17:38:18.020322 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:18.020287 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" Apr 24 17:38:18.038683 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:18.038632 2573 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" podStartSLOduration=6.038617302 podStartE2EDuration="6.038617302s" podCreationTimestamp="2026-04-24 17:38:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:38:18.037565749 +0000 UTC m=+3556.302281146" watchObservedRunningTime="2026-04-24 17:38:18.038617302 +0000 UTC m=+3556.303332699" Apr 24 17:38:18.217371 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:18.217335 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f9be04b-4cf3-405b-a269-cb0c52abbb2a" path="/var/lib/kubelet/pods/8f9be04b-4cf3-405b-a269-cb0c52abbb2a/volumes" Apr 24 17:38:24.028414 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:24.028378 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" Apr 24 17:38:54.032595 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:38:54.032498 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" podUID="2343a3b7-85bf-45ae-9376-d4d7b23c3017" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 24 17:39:04.031412 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:04.031374 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" Apr 24 17:39:07.762178 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:07.762146 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log" Apr 24 17:39:07.764482 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:07.764457 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log" Apr 24 17:39:12.835413 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:12.835380 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89"] Apr 24 17:39:12.835883 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:12.835789 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" podUID="2343a3b7-85bf-45ae-9376-d4d7b23c3017" containerName="kserve-container" containerID="cri-o://f80505bc54d23dd51be420b512d2bcd0bc30885b116364e3d59d02b70102ec5a" gracePeriod=30 Apr 24 17:39:12.835883 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:12.835813 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" podUID="2343a3b7-85bf-45ae-9376-d4d7b23c3017" containerName="kube-rbac-proxy" containerID="cri-o://d431c8b8ec4bd66eaf95fa6e8a415a00fa0b5851ccbee8aed538781b64893aac" gracePeriod=30 Apr 24 17:39:12.922980 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:12.922944 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t"] Apr 24 17:39:12.923260 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:12.923247 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f9be04b-4cf3-405b-a269-cb0c52abbb2a" containerName="storage-initializer" Apr 24 17:39:12.923340 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:12.923262 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f9be04b-4cf3-405b-a269-cb0c52abbb2a" containerName="storage-initializer" Apr 24 17:39:12.923340 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:12.923269 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="8f9be04b-4cf3-405b-a269-cb0c52abbb2a" containerName="kserve-container" Apr 24 17:39:12.923340 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:12.923275 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f9be04b-4cf3-405b-a269-cb0c52abbb2a" containerName="kserve-container" Apr 24 17:39:12.923340 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:12.923292 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f9be04b-4cf3-405b-a269-cb0c52abbb2a" containerName="kube-rbac-proxy" Apr 24 17:39:12.923340 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:12.923298 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f9be04b-4cf3-405b-a269-cb0c52abbb2a" containerName="kube-rbac-proxy" Apr 24 17:39:12.923504 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:12.923365 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f9be04b-4cf3-405b-a269-cb0c52abbb2a" containerName="kube-rbac-proxy" Apr 24 17:39:12.923504 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:12.923375 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f9be04b-4cf3-405b-a269-cb0c52abbb2a" containerName="kserve-container" Apr 24 17:39:12.927751 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:12.927731 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" Apr 24 17:39:12.930101 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:12.930066 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-predictor-serving-cert\"" Apr 24 17:39:12.930225 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:12.930138 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 24 17:39:12.937898 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:12.937866 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t"] Apr 24 17:39:13.048419 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:13.048384 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/668adc89-9b4c-44bf-90a2-afc4c038cd44-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t\" (UID: \"668adc89-9b4c-44bf-90a2-afc4c038cd44\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" Apr 24 17:39:13.048586 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:13.048447 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/668adc89-9b4c-44bf-90a2-afc4c038cd44-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t\" (UID: \"668adc89-9b4c-44bf-90a2-afc4c038cd44\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" Apr 24 17:39:13.048586 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:13.048508 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpvdx\" (UniqueName: 
\"kubernetes.io/projected/668adc89-9b4c-44bf-90a2-afc4c038cd44-kube-api-access-dpvdx\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t\" (UID: \"668adc89-9b4c-44bf-90a2-afc4c038cd44\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" Apr 24 17:39:13.048664 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:13.048588 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/668adc89-9b4c-44bf-90a2-afc4c038cd44-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t\" (UID: \"668adc89-9b4c-44bf-90a2-afc4c038cd44\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" Apr 24 17:39:13.149486 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:13.149417 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/668adc89-9b4c-44bf-90a2-afc4c038cd44-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t\" (UID: \"668adc89-9b4c-44bf-90a2-afc4c038cd44\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" Apr 24 17:39:13.149486 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:13.149451 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/668adc89-9b4c-44bf-90a2-afc4c038cd44-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t\" (UID: \"668adc89-9b4c-44bf-90a2-afc4c038cd44\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" Apr 24 17:39:13.149674 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:13.149487 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/668adc89-9b4c-44bf-90a2-afc4c038cd44-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t\" (UID: \"668adc89-9b4c-44bf-90a2-afc4c038cd44\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" Apr 24 17:39:13.149674 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:13.149517 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpvdx\" (UniqueName: \"kubernetes.io/projected/668adc89-9b4c-44bf-90a2-afc4c038cd44-kube-api-access-dpvdx\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t\" (UID: \"668adc89-9b4c-44bf-90a2-afc4c038cd44\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" Apr 24 17:39:13.149889 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:13.149870 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/668adc89-9b4c-44bf-90a2-afc4c038cd44-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t\" (UID: \"668adc89-9b4c-44bf-90a2-afc4c038cd44\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" Apr 24 17:39:13.150266 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:13.150239 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/668adc89-9b4c-44bf-90a2-afc4c038cd44-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t\" (UID: \"668adc89-9b4c-44bf-90a2-afc4c038cd44\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" Apr 24 17:39:13.152255 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:13.152235 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/668adc89-9b4c-44bf-90a2-afc4c038cd44-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t\" (UID: 
\"668adc89-9b4c-44bf-90a2-afc4c038cd44\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" Apr 24 17:39:13.158370 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:13.158343 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpvdx\" (UniqueName: \"kubernetes.io/projected/668adc89-9b4c-44bf-90a2-afc4c038cd44-kube-api-access-dpvdx\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t\" (UID: \"668adc89-9b4c-44bf-90a2-afc4c038cd44\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" Apr 24 17:39:13.174058 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:13.174020 2573 generic.go:358] "Generic (PLEG): container finished" podID="2343a3b7-85bf-45ae-9376-d4d7b23c3017" containerID="d431c8b8ec4bd66eaf95fa6e8a415a00fa0b5851ccbee8aed538781b64893aac" exitCode=2 Apr 24 17:39:13.174224 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:13.174064 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" event={"ID":"2343a3b7-85bf-45ae-9376-d4d7b23c3017","Type":"ContainerDied","Data":"d431c8b8ec4bd66eaf95fa6e8a415a00fa0b5851ccbee8aed538781b64893aac"} Apr 24 17:39:13.238801 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:13.238749 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" Apr 24 17:39:13.362438 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:13.362397 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t"] Apr 24 17:39:13.365399 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:39:13.365374 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod668adc89_9b4c_44bf_90a2_afc4c038cd44.slice/crio-1ff7b1a7dbe0d5a0c4c8535b309ba7f4a575d38047e8ef1edfdf007fe8e9de1d WatchSource:0}: Error finding container 1ff7b1a7dbe0d5a0c4c8535b309ba7f4a575d38047e8ef1edfdf007fe8e9de1d: Status 404 returned error can't find the container with id 1ff7b1a7dbe0d5a0c4c8535b309ba7f4a575d38047e8ef1edfdf007fe8e9de1d Apr 24 17:39:13.367215 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:13.367199 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 17:39:14.024711 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:14.024663 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" podUID="2343a3b7-85bf-45ae-9376-d4d7b23c3017" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.59:8643/healthz\": dial tcp 10.134.0.59:8643: connect: connection refused" Apr 24 17:39:14.178102 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:14.178063 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" event={"ID":"668adc89-9b4c-44bf-90a2-afc4c038cd44","Type":"ContainerStarted","Data":"e55676b664f62e60761ee34542a0facb2d3538b9b6055a9a1012c60a235b5b4d"} Apr 24 17:39:14.178274 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:14.178108 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" event={"ID":"668adc89-9b4c-44bf-90a2-afc4c038cd44","Type":"ContainerStarted","Data":"1ff7b1a7dbe0d5a0c4c8535b309ba7f4a575d38047e8ef1edfdf007fe8e9de1d"} Apr 24 17:39:15.070553 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:15.070501 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" podUID="2343a3b7-85bf-45ae-9376-d4d7b23c3017" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.59:8080/v2/models/isvc-xgboost-v2-runtime/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 24 17:39:17.186976 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:17.186945 2573 generic.go:358] "Generic (PLEG): container finished" podID="668adc89-9b4c-44bf-90a2-afc4c038cd44" containerID="e55676b664f62e60761ee34542a0facb2d3538b9b6055a9a1012c60a235b5b4d" exitCode=0 Apr 24 17:39:17.187380 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:17.186991 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" event={"ID":"668adc89-9b4c-44bf-90a2-afc4c038cd44","Type":"ContainerDied","Data":"e55676b664f62e60761ee34542a0facb2d3538b9b6055a9a1012c60a235b5b4d"} Apr 24 17:39:18.192715 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:18.192665 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" event={"ID":"668adc89-9b4c-44bf-90a2-afc4c038cd44","Type":"ContainerStarted","Data":"85003408fc53ae3f6df7501d937509a158f67d34da126a2c2f35bcd7a808e99e"} Apr 24 17:39:18.192715 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:18.192708 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" 
event={"ID":"668adc89-9b4c-44bf-90a2-afc4c038cd44","Type":"ContainerStarted","Data":"045f15dd5f2f6cc41ba5df2a05b1da5ad451d3bc0cff817a5b1c0b6b8a3cf23b"} Apr 24 17:39:18.193249 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:18.192960 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" Apr 24 17:39:18.212381 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:18.212322 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" podStartSLOduration=6.212294886 podStartE2EDuration="6.212294886s" podCreationTimestamp="2026-04-24 17:39:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:39:18.210544218 +0000 UTC m=+3616.475259618" watchObservedRunningTime="2026-04-24 17:39:18.212294886 +0000 UTC m=+3616.477010283" Apr 24 17:39:19.023958 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:19.023910 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" podUID="2343a3b7-85bf-45ae-9376-d4d7b23c3017" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.59:8643/healthz\": dial tcp 10.134.0.59:8643: connect: connection refused" Apr 24 17:39:19.195531 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:19.195498 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" Apr 24 17:39:19.196729 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:19.196698 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" podUID="668adc89-9b4c-44bf-90a2-afc4c038cd44" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: 
connection refused" Apr 24 17:39:20.198456 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:20.198420 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" podUID="668adc89-9b4c-44bf-90a2-afc4c038cd44" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 24 17:39:20.674439 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:20.674410 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" Apr 24 17:39:20.811799 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:20.811706 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4g2v\" (UniqueName: \"kubernetes.io/projected/2343a3b7-85bf-45ae-9376-d4d7b23c3017-kube-api-access-x4g2v\") pod \"2343a3b7-85bf-45ae-9376-d4d7b23c3017\" (UID: \"2343a3b7-85bf-45ae-9376-d4d7b23c3017\") " Apr 24 17:39:20.811799 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:20.811754 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2343a3b7-85bf-45ae-9376-d4d7b23c3017-kserve-provision-location\") pod \"2343a3b7-85bf-45ae-9376-d4d7b23c3017\" (UID: \"2343a3b7-85bf-45ae-9376-d4d7b23c3017\") " Apr 24 17:39:20.811799 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:20.811786 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2343a3b7-85bf-45ae-9376-d4d7b23c3017-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"2343a3b7-85bf-45ae-9376-d4d7b23c3017\" (UID: \"2343a3b7-85bf-45ae-9376-d4d7b23c3017\") " Apr 24 17:39:20.812027 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:20.811807 2573 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2343a3b7-85bf-45ae-9376-d4d7b23c3017-proxy-tls\") pod \"2343a3b7-85bf-45ae-9376-d4d7b23c3017\" (UID: \"2343a3b7-85bf-45ae-9376-d4d7b23c3017\") " Apr 24 17:39:20.812100 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:20.812080 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2343a3b7-85bf-45ae-9376-d4d7b23c3017-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2343a3b7-85bf-45ae-9376-d4d7b23c3017" (UID: "2343a3b7-85bf-45ae-9376-d4d7b23c3017"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:39:20.812224 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:20.812197 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2343a3b7-85bf-45ae-9376-d4d7b23c3017-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config") pod "2343a3b7-85bf-45ae-9376-d4d7b23c3017" (UID: "2343a3b7-85bf-45ae-9376-d4d7b23c3017"). InnerVolumeSpecName "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:39:20.813957 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:20.813932 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2343a3b7-85bf-45ae-9376-d4d7b23c3017-kube-api-access-x4g2v" (OuterVolumeSpecName: "kube-api-access-x4g2v") pod "2343a3b7-85bf-45ae-9376-d4d7b23c3017" (UID: "2343a3b7-85bf-45ae-9376-d4d7b23c3017"). InnerVolumeSpecName "kube-api-access-x4g2v". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:39:20.814068 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:20.813997 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2343a3b7-85bf-45ae-9376-d4d7b23c3017-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2343a3b7-85bf-45ae-9376-d4d7b23c3017" (UID: "2343a3b7-85bf-45ae-9376-d4d7b23c3017"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:39:20.912619 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:20.912574 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x4g2v\" (UniqueName: \"kubernetes.io/projected/2343a3b7-85bf-45ae-9376-d4d7b23c3017-kube-api-access-x4g2v\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:39:20.912619 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:20.912611 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2343a3b7-85bf-45ae-9376-d4d7b23c3017-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:39:20.912619 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:20.912621 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2343a3b7-85bf-45ae-9376-d4d7b23c3017-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:39:20.912619 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:20.912631 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2343a3b7-85bf-45ae-9376-d4d7b23c3017-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:39:21.202503 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:21.202411 2573 generic.go:358] "Generic (PLEG): container finished" 
podID="2343a3b7-85bf-45ae-9376-d4d7b23c3017" containerID="f80505bc54d23dd51be420b512d2bcd0bc30885b116364e3d59d02b70102ec5a" exitCode=0 Apr 24 17:39:21.202933 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:21.202500 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" event={"ID":"2343a3b7-85bf-45ae-9376-d4d7b23c3017","Type":"ContainerDied","Data":"f80505bc54d23dd51be420b512d2bcd0bc30885b116364e3d59d02b70102ec5a"} Apr 24 17:39:21.202933 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:21.202513 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" Apr 24 17:39:21.202933 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:21.202543 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89" event={"ID":"2343a3b7-85bf-45ae-9376-d4d7b23c3017","Type":"ContainerDied","Data":"a254a9687882a49fac79850064bb1890c48bb107ac7abf5a427351964f69c817"} Apr 24 17:39:21.202933 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:21.202558 2573 scope.go:117] "RemoveContainer" containerID="d431c8b8ec4bd66eaf95fa6e8a415a00fa0b5851ccbee8aed538781b64893aac" Apr 24 17:39:21.210737 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:21.210717 2573 scope.go:117] "RemoveContainer" containerID="f80505bc54d23dd51be420b512d2bcd0bc30885b116364e3d59d02b70102ec5a" Apr 24 17:39:21.217817 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:21.217797 2573 scope.go:117] "RemoveContainer" containerID="b8a4d6ea318af85ec74e22f856454261a29ba8c56f58b8413b62000b349c1ee2" Apr 24 17:39:21.223896 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:21.223869 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89"] Apr 24 17:39:21.225917 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:21.225885 2573 
scope.go:117] "RemoveContainer" containerID="d431c8b8ec4bd66eaf95fa6e8a415a00fa0b5851ccbee8aed538781b64893aac" Apr 24 17:39:21.226606 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:39:21.226260 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d431c8b8ec4bd66eaf95fa6e8a415a00fa0b5851ccbee8aed538781b64893aac\": container with ID starting with d431c8b8ec4bd66eaf95fa6e8a415a00fa0b5851ccbee8aed538781b64893aac not found: ID does not exist" containerID="d431c8b8ec4bd66eaf95fa6e8a415a00fa0b5851ccbee8aed538781b64893aac" Apr 24 17:39:21.226606 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:21.226299 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d431c8b8ec4bd66eaf95fa6e8a415a00fa0b5851ccbee8aed538781b64893aac"} err="failed to get container status \"d431c8b8ec4bd66eaf95fa6e8a415a00fa0b5851ccbee8aed538781b64893aac\": rpc error: code = NotFound desc = could not find container \"d431c8b8ec4bd66eaf95fa6e8a415a00fa0b5851ccbee8aed538781b64893aac\": container with ID starting with d431c8b8ec4bd66eaf95fa6e8a415a00fa0b5851ccbee8aed538781b64893aac not found: ID does not exist" Apr 24 17:39:21.226606 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:21.226482 2573 scope.go:117] "RemoveContainer" containerID="f80505bc54d23dd51be420b512d2bcd0bc30885b116364e3d59d02b70102ec5a" Apr 24 17:39:21.226835 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:39:21.226761 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f80505bc54d23dd51be420b512d2bcd0bc30885b116364e3d59d02b70102ec5a\": container with ID starting with f80505bc54d23dd51be420b512d2bcd0bc30885b116364e3d59d02b70102ec5a not found: ID does not exist" containerID="f80505bc54d23dd51be420b512d2bcd0bc30885b116364e3d59d02b70102ec5a" Apr 24 17:39:21.226835 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:21.226794 2573 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f80505bc54d23dd51be420b512d2bcd0bc30885b116364e3d59d02b70102ec5a"} err="failed to get container status \"f80505bc54d23dd51be420b512d2bcd0bc30885b116364e3d59d02b70102ec5a\": rpc error: code = NotFound desc = could not find container \"f80505bc54d23dd51be420b512d2bcd0bc30885b116364e3d59d02b70102ec5a\": container with ID starting with f80505bc54d23dd51be420b512d2bcd0bc30885b116364e3d59d02b70102ec5a not found: ID does not exist" Apr 24 17:39:21.226835 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:21.226813 2573 scope.go:117] "RemoveContainer" containerID="b8a4d6ea318af85ec74e22f856454261a29ba8c56f58b8413b62000b349c1ee2" Apr 24 17:39:21.227071 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:39:21.227053 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8a4d6ea318af85ec74e22f856454261a29ba8c56f58b8413b62000b349c1ee2\": container with ID starting with b8a4d6ea318af85ec74e22f856454261a29ba8c56f58b8413b62000b349c1ee2 not found: ID does not exist" containerID="b8a4d6ea318af85ec74e22f856454261a29ba8c56f58b8413b62000b349c1ee2" Apr 24 17:39:21.227142 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:21.227074 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8a4d6ea318af85ec74e22f856454261a29ba8c56f58b8413b62000b349c1ee2"} err="failed to get container status \"b8a4d6ea318af85ec74e22f856454261a29ba8c56f58b8413b62000b349c1ee2\": rpc error: code = NotFound desc = could not find container \"b8a4d6ea318af85ec74e22f856454261a29ba8c56f58b8413b62000b349c1ee2\": container with ID starting with b8a4d6ea318af85ec74e22f856454261a29ba8c56f58b8413b62000b349c1ee2 not found: ID does not exist" Apr 24 17:39:21.227572 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:21.227550 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qsr89"] Apr 24 17:39:22.215942 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:22.215899 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2343a3b7-85bf-45ae-9376-d4d7b23c3017" path="/var/lib/kubelet/pods/2343a3b7-85bf-45ae-9376-d4d7b23c3017/volumes" Apr 24 17:39:25.203698 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:25.203668 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" Apr 24 17:39:25.204341 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:25.204293 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" podUID="668adc89-9b4c-44bf-90a2-afc4c038cd44" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 24 17:39:35.205232 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:35.205191 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" podUID="668adc89-9b4c-44bf-90a2-afc4c038cd44" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 24 17:39:45.205182 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:45.205137 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" podUID="668adc89-9b4c-44bf-90a2-afc4c038cd44" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 24 17:39:55.205049 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:39:55.205008 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" podUID="668adc89-9b4c-44bf-90a2-afc4c038cd44" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 24 17:40:05.204575 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:05.204531 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" podUID="668adc89-9b4c-44bf-90a2-afc4c038cd44" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 24 17:40:15.205403 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:15.205375 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" Apr 24 17:40:23.022934 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.022841 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t"] Apr 24 17:40:23.023407 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.023201 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" podUID="668adc89-9b4c-44bf-90a2-afc4c038cd44" containerName="kserve-container" containerID="cri-o://045f15dd5f2f6cc41ba5df2a05b1da5ad451d3bc0cff817a5b1c0b6b8a3cf23b" gracePeriod=30 Apr 24 17:40:23.023407 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.023213 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" podUID="668adc89-9b4c-44bf-90a2-afc4c038cd44" containerName="kube-rbac-proxy" containerID="cri-o://85003408fc53ae3f6df7501d937509a158f67d34da126a2c2f35bcd7a808e99e" gracePeriod=30 Apr 24 17:40:23.100109 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.100069 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx"] Apr 24 17:40:23.100411 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.100396 2573 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="2343a3b7-85bf-45ae-9376-d4d7b23c3017" containerName="kube-rbac-proxy" Apr 24 17:40:23.100411 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.100412 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2343a3b7-85bf-45ae-9376-d4d7b23c3017" containerName="kube-rbac-proxy" Apr 24 17:40:23.100515 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.100428 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2343a3b7-85bf-45ae-9376-d4d7b23c3017" containerName="storage-initializer" Apr 24 17:40:23.100515 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.100437 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2343a3b7-85bf-45ae-9376-d4d7b23c3017" containerName="storage-initializer" Apr 24 17:40:23.100515 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.100451 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2343a3b7-85bf-45ae-9376-d4d7b23c3017" containerName="kserve-container" Apr 24 17:40:23.100515 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.100457 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2343a3b7-85bf-45ae-9376-d4d7b23c3017" containerName="kserve-container" Apr 24 17:40:23.100515 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.100501 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2343a3b7-85bf-45ae-9376-d4d7b23c3017" containerName="kserve-container" Apr 24 17:40:23.100515 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.100508 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2343a3b7-85bf-45ae-9376-d4d7b23c3017" containerName="kube-rbac-proxy" Apr 24 17:40:23.103840 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.103824 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" Apr 24 17:40:23.107359 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.107331 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-predictor-serving-cert\"" Apr 24 17:40:23.107611 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.107592 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-kube-rbac-proxy-sar-config\"" Apr 24 17:40:23.107876 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.107856 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 24 17:40:23.118072 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.118048 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx"] Apr 24 17:40:23.212611 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.212566 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d987bffd-5418-4a39-9ada-511d8423a40e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-857d79f78-fvmqx\" (UID: \"d987bffd-5418-4a39-9ada-511d8423a40e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" Apr 24 17:40:23.212611 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.212614 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d987bffd-5418-4a39-9ada-511d8423a40e-proxy-tls\") pod \"isvc-sklearn-s3-predictor-857d79f78-fvmqx\" (UID: \"d987bffd-5418-4a39-9ada-511d8423a40e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" Apr 24 17:40:23.212855 ip-10-0-142-182 kubenswrapper[2573]: I0424 
17:40:23.212661 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d987bffd-5418-4a39-9ada-511d8423a40e-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-857d79f78-fvmqx\" (UID: \"d987bffd-5418-4a39-9ada-511d8423a40e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" Apr 24 17:40:23.212855 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.212693 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hgfb\" (UniqueName: \"kubernetes.io/projected/d987bffd-5418-4a39-9ada-511d8423a40e-kube-api-access-6hgfb\") pod \"isvc-sklearn-s3-predictor-857d79f78-fvmqx\" (UID: \"d987bffd-5418-4a39-9ada-511d8423a40e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" Apr 24 17:40:23.313339 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.313209 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d987bffd-5418-4a39-9ada-511d8423a40e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-857d79f78-fvmqx\" (UID: \"d987bffd-5418-4a39-9ada-511d8423a40e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" Apr 24 17:40:23.313339 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.313263 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d987bffd-5418-4a39-9ada-511d8423a40e-proxy-tls\") pod \"isvc-sklearn-s3-predictor-857d79f78-fvmqx\" (UID: \"d987bffd-5418-4a39-9ada-511d8423a40e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" Apr 24 17:40:23.313339 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.313331 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d987bffd-5418-4a39-9ada-511d8423a40e-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-857d79f78-fvmqx\" (UID: \"d987bffd-5418-4a39-9ada-511d8423a40e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" Apr 24 17:40:23.313601 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.313375 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hgfb\" (UniqueName: \"kubernetes.io/projected/d987bffd-5418-4a39-9ada-511d8423a40e-kube-api-access-6hgfb\") pod \"isvc-sklearn-s3-predictor-857d79f78-fvmqx\" (UID: \"d987bffd-5418-4a39-9ada-511d8423a40e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" Apr 24 17:40:23.313601 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:40:23.313422 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-predictor-serving-cert: secret "isvc-sklearn-s3-predictor-serving-cert" not found Apr 24 17:40:23.313601 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:40:23.313494 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d987bffd-5418-4a39-9ada-511d8423a40e-proxy-tls podName:d987bffd-5418-4a39-9ada-511d8423a40e nodeName:}" failed. No retries permitted until 2026-04-24 17:40:23.813475866 +0000 UTC m=+3682.078191253 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d987bffd-5418-4a39-9ada-511d8423a40e-proxy-tls") pod "isvc-sklearn-s3-predictor-857d79f78-fvmqx" (UID: "d987bffd-5418-4a39-9ada-511d8423a40e") : secret "isvc-sklearn-s3-predictor-serving-cert" not found Apr 24 17:40:23.313808 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.313781 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d987bffd-5418-4a39-9ada-511d8423a40e-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-857d79f78-fvmqx\" (UID: \"d987bffd-5418-4a39-9ada-511d8423a40e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" Apr 24 17:40:23.314026 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.314011 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d987bffd-5418-4a39-9ada-511d8423a40e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-857d79f78-fvmqx\" (UID: \"d987bffd-5418-4a39-9ada-511d8423a40e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" Apr 24 17:40:23.322253 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.322226 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hgfb\" (UniqueName: \"kubernetes.io/projected/d987bffd-5418-4a39-9ada-511d8423a40e-kube-api-access-6hgfb\") pod \"isvc-sklearn-s3-predictor-857d79f78-fvmqx\" (UID: \"d987bffd-5418-4a39-9ada-511d8423a40e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" Apr 24 17:40:23.385625 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.385588 2573 generic.go:358] "Generic (PLEG): container finished" podID="668adc89-9b4c-44bf-90a2-afc4c038cd44" containerID="85003408fc53ae3f6df7501d937509a158f67d34da126a2c2f35bcd7a808e99e" exitCode=2 Apr 24 17:40:23.385793 
ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.385653 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" event={"ID":"668adc89-9b4c-44bf-90a2-afc4c038cd44","Type":"ContainerDied","Data":"85003408fc53ae3f6df7501d937509a158f67d34da126a2c2f35bcd7a808e99e"} Apr 24 17:40:23.817086 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.817039 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d987bffd-5418-4a39-9ada-511d8423a40e-proxy-tls\") pod \"isvc-sklearn-s3-predictor-857d79f78-fvmqx\" (UID: \"d987bffd-5418-4a39-9ada-511d8423a40e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" Apr 24 17:40:23.819629 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:23.819598 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d987bffd-5418-4a39-9ada-511d8423a40e-proxy-tls\") pod \"isvc-sklearn-s3-predictor-857d79f78-fvmqx\" (UID: \"d987bffd-5418-4a39-9ada-511d8423a40e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" Apr 24 17:40:24.013837 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:24.013798 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" Apr 24 17:40:24.143669 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:24.143641 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx"] Apr 24 17:40:24.144624 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:40:24.144600 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd987bffd_5418_4a39_9ada_511d8423a40e.slice/crio-d8da63ff693c7b7d9f814ab7646947395b447df4a24b698d172843aaee945a5e WatchSource:0}: Error finding container d8da63ff693c7b7d9f814ab7646947395b447df4a24b698d172843aaee945a5e: Status 404 returned error can't find the container with id d8da63ff693c7b7d9f814ab7646947395b447df4a24b698d172843aaee945a5e Apr 24 17:40:24.390793 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:24.390692 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" event={"ID":"d987bffd-5418-4a39-9ada-511d8423a40e","Type":"ContainerStarted","Data":"edceab4bb0b3eee7643d3146d26428982cd55521d794fb948337b827930859f3"} Apr 24 17:40:24.390793 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:24.390741 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" event={"ID":"d987bffd-5418-4a39-9ada-511d8423a40e","Type":"ContainerStarted","Data":"d8da63ff693c7b7d9f814ab7646947395b447df4a24b698d172843aaee945a5e"} Apr 24 17:40:25.199247 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:25.199203 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" podUID="668adc89-9b4c-44bf-90a2-afc4c038cd44" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.60:8643/healthz\": dial tcp 10.134.0.60:8643: connect: connection refused" Apr 24 
17:40:25.204624 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:25.204594 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" podUID="668adc89-9b4c-44bf-90a2-afc4c038cd44" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.60:8080: connect: connection refused" Apr 24 17:40:25.394908 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:25.394818 2573 generic.go:358] "Generic (PLEG): container finished" podID="d987bffd-5418-4a39-9ada-511d8423a40e" containerID="edceab4bb0b3eee7643d3146d26428982cd55521d794fb948337b827930859f3" exitCode=0 Apr 24 17:40:25.394908 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:25.394867 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" event={"ID":"d987bffd-5418-4a39-9ada-511d8423a40e","Type":"ContainerDied","Data":"edceab4bb0b3eee7643d3146d26428982cd55521d794fb948337b827930859f3"} Apr 24 17:40:26.400179 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:26.400139 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" event={"ID":"d987bffd-5418-4a39-9ada-511d8423a40e","Type":"ContainerStarted","Data":"0daa0e64fbb3fc6b5f28c8c56468d8c52165607284068f26008f9ab687b4e2b1"} Apr 24 17:40:26.400179 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:26.400179 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" event={"ID":"d987bffd-5418-4a39-9ada-511d8423a40e","Type":"ContainerStarted","Data":"8cd976432385b2d438c7b13b5f79dbf06d9b13da2f8e0741a91b62729bd327a4"} Apr 24 17:40:26.400674 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:26.400338 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" Apr 24 17:40:26.400674 ip-10-0-142-182 kubenswrapper[2573]: I0424 
17:40:26.400472 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" Apr 24 17:40:26.401787 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:26.401759 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" podUID="d987bffd-5418-4a39-9ada-511d8423a40e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 24 17:40:26.418005 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:26.417959 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" podStartSLOduration=3.417944818 podStartE2EDuration="3.417944818s" podCreationTimestamp="2026-04-24 17:40:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:40:26.417093278 +0000 UTC m=+3684.681808680" watchObservedRunningTime="2026-04-24 17:40:26.417944818 +0000 UTC m=+3684.682660215" Apr 24 17:40:26.873378 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:26.873354 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" Apr 24 17:40:26.941863 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:26.941832 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/668adc89-9b4c-44bf-90a2-afc4c038cd44-kserve-provision-location\") pod \"668adc89-9b4c-44bf-90a2-afc4c038cd44\" (UID: \"668adc89-9b4c-44bf-90a2-afc4c038cd44\") " Apr 24 17:40:26.942053 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:26.941889 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/668adc89-9b4c-44bf-90a2-afc4c038cd44-proxy-tls\") pod \"668adc89-9b4c-44bf-90a2-afc4c038cd44\" (UID: \"668adc89-9b4c-44bf-90a2-afc4c038cd44\") " Apr 24 17:40:26.942053 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:26.941917 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpvdx\" (UniqueName: \"kubernetes.io/projected/668adc89-9b4c-44bf-90a2-afc4c038cd44-kube-api-access-dpvdx\") pod \"668adc89-9b4c-44bf-90a2-afc4c038cd44\" (UID: \"668adc89-9b4c-44bf-90a2-afc4c038cd44\") " Apr 24 17:40:26.942053 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:26.941982 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/668adc89-9b4c-44bf-90a2-afc4c038cd44-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"668adc89-9b4c-44bf-90a2-afc4c038cd44\" (UID: \"668adc89-9b4c-44bf-90a2-afc4c038cd44\") " Apr 24 17:40:26.942296 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:26.942262 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/668adc89-9b4c-44bf-90a2-afc4c038cd44-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"668adc89-9b4c-44bf-90a2-afc4c038cd44" (UID: "668adc89-9b4c-44bf-90a2-afc4c038cd44"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:40:26.942482 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:26.942380 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/668adc89-9b4c-44bf-90a2-afc4c038cd44-isvc-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-kube-rbac-proxy-sar-config") pod "668adc89-9b4c-44bf-90a2-afc4c038cd44" (UID: "668adc89-9b4c-44bf-90a2-afc4c038cd44"). InnerVolumeSpecName "isvc-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:40:26.944132 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:26.944109 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/668adc89-9b4c-44bf-90a2-afc4c038cd44-kube-api-access-dpvdx" (OuterVolumeSpecName: "kube-api-access-dpvdx") pod "668adc89-9b4c-44bf-90a2-afc4c038cd44" (UID: "668adc89-9b4c-44bf-90a2-afc4c038cd44"). InnerVolumeSpecName "kube-api-access-dpvdx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:40:26.944215 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:26.944160 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/668adc89-9b4c-44bf-90a2-afc4c038cd44-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "668adc89-9b4c-44bf-90a2-afc4c038cd44" (UID: "668adc89-9b4c-44bf-90a2-afc4c038cd44"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:40:27.042481 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:27.042439 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/668adc89-9b4c-44bf-90a2-afc4c038cd44-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:40:27.042481 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:27.042472 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/668adc89-9b4c-44bf-90a2-afc4c038cd44-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:40:27.042481 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:27.042482 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/668adc89-9b4c-44bf-90a2-afc4c038cd44-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:40:27.042481 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:27.042491 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dpvdx\" (UniqueName: \"kubernetes.io/projected/668adc89-9b4c-44bf-90a2-afc4c038cd44-kube-api-access-dpvdx\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:40:27.407191 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:27.407095 2573 generic.go:358] "Generic (PLEG): container finished" podID="668adc89-9b4c-44bf-90a2-afc4c038cd44" containerID="045f15dd5f2f6cc41ba5df2a05b1da5ad451d3bc0cff817a5b1c0b6b8a3cf23b" exitCode=0 Apr 24 17:40:27.407191 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:27.407178 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" 
event={"ID":"668adc89-9b4c-44bf-90a2-afc4c038cd44","Type":"ContainerDied","Data":"045f15dd5f2f6cc41ba5df2a05b1da5ad451d3bc0cff817a5b1c0b6b8a3cf23b"} Apr 24 17:40:27.407692 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:27.407213 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" event={"ID":"668adc89-9b4c-44bf-90a2-afc4c038cd44","Type":"ContainerDied","Data":"1ff7b1a7dbe0d5a0c4c8535b309ba7f4a575d38047e8ef1edfdf007fe8e9de1d"} Apr 24 17:40:27.407692 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:27.407184 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t" Apr 24 17:40:27.407692 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:27.407232 2573 scope.go:117] "RemoveContainer" containerID="85003408fc53ae3f6df7501d937509a158f67d34da126a2c2f35bcd7a808e99e" Apr 24 17:40:27.407991 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:27.407963 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" podUID="d987bffd-5418-4a39-9ada-511d8423a40e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 24 17:40:27.415606 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:27.415503 2573 scope.go:117] "RemoveContainer" containerID="045f15dd5f2f6cc41ba5df2a05b1da5ad451d3bc0cff817a5b1c0b6b8a3cf23b" Apr 24 17:40:27.422889 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:27.422870 2573 scope.go:117] "RemoveContainer" containerID="e55676b664f62e60761ee34542a0facb2d3538b9b6055a9a1012c60a235b5b4d" Apr 24 17:40:27.430477 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:27.430453 2573 scope.go:117] "RemoveContainer" containerID="85003408fc53ae3f6df7501d937509a158f67d34da126a2c2f35bcd7a808e99e" Apr 24 17:40:27.430761 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:40:27.430739 2573 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85003408fc53ae3f6df7501d937509a158f67d34da126a2c2f35bcd7a808e99e\": container with ID starting with 85003408fc53ae3f6df7501d937509a158f67d34da126a2c2f35bcd7a808e99e not found: ID does not exist" containerID="85003408fc53ae3f6df7501d937509a158f67d34da126a2c2f35bcd7a808e99e" Apr 24 17:40:27.430836 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:27.430775 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85003408fc53ae3f6df7501d937509a158f67d34da126a2c2f35bcd7a808e99e"} err="failed to get container status \"85003408fc53ae3f6df7501d937509a158f67d34da126a2c2f35bcd7a808e99e\": rpc error: code = NotFound desc = could not find container \"85003408fc53ae3f6df7501d937509a158f67d34da126a2c2f35bcd7a808e99e\": container with ID starting with 85003408fc53ae3f6df7501d937509a158f67d34da126a2c2f35bcd7a808e99e not found: ID does not exist" Apr 24 17:40:27.430836 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:27.430801 2573 scope.go:117] "RemoveContainer" containerID="045f15dd5f2f6cc41ba5df2a05b1da5ad451d3bc0cff817a5b1c0b6b8a3cf23b" Apr 24 17:40:27.431041 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:40:27.431027 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"045f15dd5f2f6cc41ba5df2a05b1da5ad451d3bc0cff817a5b1c0b6b8a3cf23b\": container with ID starting with 045f15dd5f2f6cc41ba5df2a05b1da5ad451d3bc0cff817a5b1c0b6b8a3cf23b not found: ID does not exist" containerID="045f15dd5f2f6cc41ba5df2a05b1da5ad451d3bc0cff817a5b1c0b6b8a3cf23b" Apr 24 17:40:27.431102 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:27.431048 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"045f15dd5f2f6cc41ba5df2a05b1da5ad451d3bc0cff817a5b1c0b6b8a3cf23b"} err="failed to get container status 
\"045f15dd5f2f6cc41ba5df2a05b1da5ad451d3bc0cff817a5b1c0b6b8a3cf23b\": rpc error: code = NotFound desc = could not find container \"045f15dd5f2f6cc41ba5df2a05b1da5ad451d3bc0cff817a5b1c0b6b8a3cf23b\": container with ID starting with 045f15dd5f2f6cc41ba5df2a05b1da5ad451d3bc0cff817a5b1c0b6b8a3cf23b not found: ID does not exist" Apr 24 17:40:27.431102 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:27.431067 2573 scope.go:117] "RemoveContainer" containerID="e55676b664f62e60761ee34542a0facb2d3538b9b6055a9a1012c60a235b5b4d" Apr 24 17:40:27.431320 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:40:27.431287 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e55676b664f62e60761ee34542a0facb2d3538b9b6055a9a1012c60a235b5b4d\": container with ID starting with e55676b664f62e60761ee34542a0facb2d3538b9b6055a9a1012c60a235b5b4d not found: ID does not exist" containerID="e55676b664f62e60761ee34542a0facb2d3538b9b6055a9a1012c60a235b5b4d" Apr 24 17:40:27.431382 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:27.431332 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e55676b664f62e60761ee34542a0facb2d3538b9b6055a9a1012c60a235b5b4d"} err="failed to get container status \"e55676b664f62e60761ee34542a0facb2d3538b9b6055a9a1012c60a235b5b4d\": rpc error: code = NotFound desc = could not find container \"e55676b664f62e60761ee34542a0facb2d3538b9b6055a9a1012c60a235b5b4d\": container with ID starting with e55676b664f62e60761ee34542a0facb2d3538b9b6055a9a1012c60a235b5b4d not found: ID does not exist" Apr 24 17:40:27.446883 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:27.446852 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t"] Apr 24 17:40:27.453108 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:27.453081 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-sxg9t"] Apr 24 17:40:28.216986 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:28.216943 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="668adc89-9b4c-44bf-90a2-afc4c038cd44" path="/var/lib/kubelet/pods/668adc89-9b4c-44bf-90a2-afc4c038cd44/volumes" Apr 24 17:40:32.412094 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:32.412065 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" Apr 24 17:40:32.412748 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:32.412715 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" podUID="d987bffd-5418-4a39-9ada-511d8423a40e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 24 17:40:42.412627 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:42.412580 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" podUID="d987bffd-5418-4a39-9ada-511d8423a40e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 24 17:40:52.413218 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:40:52.413154 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" podUID="d987bffd-5418-4a39-9ada-511d8423a40e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 24 17:41:02.412683 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:02.412640 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" podUID="d987bffd-5418-4a39-9ada-511d8423a40e" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.61:8080: connect: connection refused" Apr 24 17:41:12.413265 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:12.413219 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" podUID="d987bffd-5418-4a39-9ada-511d8423a40e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 24 17:41:22.412914 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:22.412869 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" podUID="d987bffd-5418-4a39-9ada-511d8423a40e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused" Apr 24 17:41:32.413359 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:32.413327 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" Apr 24 17:41:33.197204 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.197165 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx"] Apr 24 17:41:33.197545 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.197487 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" podUID="d987bffd-5418-4a39-9ada-511d8423a40e" containerName="kserve-container" containerID="cri-o://8cd976432385b2d438c7b13b5f79dbf06d9b13da2f8e0741a91b62729bd327a4" gracePeriod=30 Apr 24 17:41:33.197545 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.197524 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" podUID="d987bffd-5418-4a39-9ada-511d8423a40e" containerName="kube-rbac-proxy" 
containerID="cri-o://0daa0e64fbb3fc6b5f28c8c56468d8c52165607284068f26008f9ab687b4e2b1" gracePeriod=30 Apr 24 17:41:33.333722 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.333686 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz"] Apr 24 17:41:33.334049 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.334032 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="668adc89-9b4c-44bf-90a2-afc4c038cd44" containerName="kube-rbac-proxy" Apr 24 17:41:33.334141 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.334052 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="668adc89-9b4c-44bf-90a2-afc4c038cd44" containerName="kube-rbac-proxy" Apr 24 17:41:33.334141 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.334083 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="668adc89-9b4c-44bf-90a2-afc4c038cd44" containerName="storage-initializer" Apr 24 17:41:33.334141 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.334092 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="668adc89-9b4c-44bf-90a2-afc4c038cd44" containerName="storage-initializer" Apr 24 17:41:33.334141 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.334107 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="668adc89-9b4c-44bf-90a2-afc4c038cd44" containerName="kserve-container" Apr 24 17:41:33.334141 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.334115 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="668adc89-9b4c-44bf-90a2-afc4c038cd44" containerName="kserve-container" Apr 24 17:41:33.334436 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.334196 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="668adc89-9b4c-44bf-90a2-afc4c038cd44" containerName="kube-rbac-proxy" Apr 24 17:41:33.334436 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.334209 2573 
memory_manager.go:356] "RemoveStaleState removing state" podUID="668adc89-9b4c-44bf-90a2-afc4c038cd44" containerName="kserve-container" Apr 24 17:41:33.337343 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.337322 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" Apr 24 17:41:33.340397 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.340212 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\"" Apr 24 17:41:33.340397 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.340384 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-predictor-serving-cert\"" Apr 24 17:41:33.340599 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.340568 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 17:41:33.347742 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.347711 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz"] Apr 24 17:41:33.373552 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.373512 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zxdd\" (UniqueName: \"kubernetes.io/projected/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-kube-api-access-5zxdd\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz\" (UID: \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" Apr 24 17:41:33.373727 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.373568 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz\" (UID: \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" Apr 24 17:41:33.373727 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.373597 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz\" (UID: \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" Apr 24 17:41:33.373727 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.373624 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz\" (UID: \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" Apr 24 17:41:33.373727 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.373679 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz\" (UID: \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" Apr 24 17:41:33.474730 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.474613 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz\" (UID: \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" Apr 24 17:41:33.474730 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.474676 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz\" (UID: \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" Apr 24 17:41:33.474730 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.474700 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz\" (UID: \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" Apr 24 17:41:33.474730 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.474721 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz\" (UID: \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" Apr 24 17:41:33.475296 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.474755 
2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zxdd\" (UniqueName: \"kubernetes.io/projected/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-kube-api-access-5zxdd\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz\" (UID: \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" Apr 24 17:41:33.475296 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:41:33.474869 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-serving-cert: secret "isvc-sklearn-s3-tls-global-pass-predictor-serving-cert" not found Apr 24 17:41:33.475296 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:41:33.474948 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-proxy-tls podName:6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1 nodeName:}" failed. No retries permitted until 2026-04-24 17:41:33.974927391 +0000 UTC m=+3752.239642766 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-proxy-tls") pod "isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" (UID: "6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1") : secret "isvc-sklearn-s3-tls-global-pass-predictor-serving-cert" not found Apr 24 17:41:33.475296 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.475078 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz\" (UID: \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" Apr 24 17:41:33.475470 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.475375 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz\" (UID: \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" Apr 24 17:41:33.475470 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.475429 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz\" (UID: \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" Apr 24 17:41:33.485567 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.485537 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5zxdd\" (UniqueName: \"kubernetes.io/projected/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-kube-api-access-5zxdd\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz\" (UID: \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" Apr 24 17:41:33.591722 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.591686 2573 generic.go:358] "Generic (PLEG): container finished" podID="d987bffd-5418-4a39-9ada-511d8423a40e" containerID="0daa0e64fbb3fc6b5f28c8c56468d8c52165607284068f26008f9ab687b4e2b1" exitCode=2 Apr 24 17:41:33.591888 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.591754 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" event={"ID":"d987bffd-5418-4a39-9ada-511d8423a40e","Type":"ContainerDied","Data":"0daa0e64fbb3fc6b5f28c8c56468d8c52165607284068f26008f9ab687b4e2b1"} Apr 24 17:41:33.979993 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.979953 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz\" (UID: \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" Apr 24 17:41:33.982606 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:33.982577 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz\" (UID: \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" Apr 24 17:41:34.253416 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:34.253379 2573 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" Apr 24 17:41:34.377089 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:34.377058 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz"] Apr 24 17:41:34.379398 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:41:34.379357 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f5f6a76_a410_4fb0_bb9a_77705a4cf1a1.slice/crio-71ba1037d23487ec15ae7c3c944c8d721bf997d4003b26b82dd0001c41684b6e WatchSource:0}: Error finding container 71ba1037d23487ec15ae7c3c944c8d721bf997d4003b26b82dd0001c41684b6e: Status 404 returned error can't find the container with id 71ba1037d23487ec15ae7c3c944c8d721bf997d4003b26b82dd0001c41684b6e Apr 24 17:41:34.596356 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:34.596241 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" event={"ID":"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1","Type":"ContainerStarted","Data":"6126a2e5e7641fdf787ee57d3fb865a7e86a31d3c04675605acc53b8689e5b94"} Apr 24 17:41:34.596356 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:34.596283 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" event={"ID":"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1","Type":"ContainerStarted","Data":"71ba1037d23487ec15ae7c3c944c8d721bf997d4003b26b82dd0001c41684b6e"} Apr 24 17:41:35.600591 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:35.600554 2573 generic.go:358] "Generic (PLEG): container finished" podID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" containerID="6126a2e5e7641fdf787ee57d3fb865a7e86a31d3c04675605acc53b8689e5b94" exitCode=0 Apr 24 17:41:35.600987 ip-10-0-142-182 kubenswrapper[2573]: I0424 
17:41:35.600647 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" event={"ID":"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1","Type":"ContainerDied","Data":"6126a2e5e7641fdf787ee57d3fb865a7e86a31d3c04675605acc53b8689e5b94"} Apr 24 17:41:36.605870 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:36.605832 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" event={"ID":"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1","Type":"ContainerStarted","Data":"b3b24d91b2cfbd90e4a98db0b74d34f695d153da00241db78ecd556cf0d7b1a9"} Apr 24 17:41:36.605870 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:36.605872 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" event={"ID":"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1","Type":"ContainerStarted","Data":"442a0c90a9069f86f56c4c3ba11d12f68bdebdcfc8fe3a3f8dc20a80c1698808"} Apr 24 17:41:36.606334 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:36.606016 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" Apr 24 17:41:36.626254 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:36.626193 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" podStartSLOduration=3.626173122 podStartE2EDuration="3.626173122s" podCreationTimestamp="2026-04-24 17:41:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:41:36.624450908 +0000 UTC m=+3754.889166304" watchObservedRunningTime="2026-04-24 17:41:36.626173122 +0000 UTC m=+3754.890888519" Apr 24 17:41:37.407825 ip-10-0-142-182 kubenswrapper[2573]: I0424 
17:41:37.407782 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" podUID="d987bffd-5418-4a39-9ada-511d8423a40e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.61:8643/healthz\": dial tcp 10.134.0.61:8643: connect: connection refused" Apr 24 17:41:37.611296 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:37.611264 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" Apr 24 17:41:37.612421 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:37.612392 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" podUID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 24 17:41:37.742271 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:37.742247 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" Apr 24 17:41:37.811784 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:37.811748 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d987bffd-5418-4a39-9ada-511d8423a40e-proxy-tls\") pod \"d987bffd-5418-4a39-9ada-511d8423a40e\" (UID: \"d987bffd-5418-4a39-9ada-511d8423a40e\") " Apr 24 17:41:37.811954 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:37.811791 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hgfb\" (UniqueName: \"kubernetes.io/projected/d987bffd-5418-4a39-9ada-511d8423a40e-kube-api-access-6hgfb\") pod \"d987bffd-5418-4a39-9ada-511d8423a40e\" (UID: \"d987bffd-5418-4a39-9ada-511d8423a40e\") " Apr 24 17:41:37.811954 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:37.811817 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d987bffd-5418-4a39-9ada-511d8423a40e-kserve-provision-location\") pod \"d987bffd-5418-4a39-9ada-511d8423a40e\" (UID: \"d987bffd-5418-4a39-9ada-511d8423a40e\") " Apr 24 17:41:37.811954 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:37.811872 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d987bffd-5418-4a39-9ada-511d8423a40e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"d987bffd-5418-4a39-9ada-511d8423a40e\" (UID: \"d987bffd-5418-4a39-9ada-511d8423a40e\") " Apr 24 17:41:37.812212 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:37.812187 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d987bffd-5418-4a39-9ada-511d8423a40e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"d987bffd-5418-4a39-9ada-511d8423a40e" (UID: "d987bffd-5418-4a39-9ada-511d8423a40e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:41:37.812386 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:37.812276 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d987bffd-5418-4a39-9ada-511d8423a40e-isvc-sklearn-s3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-kube-rbac-proxy-sar-config") pod "d987bffd-5418-4a39-9ada-511d8423a40e" (UID: "d987bffd-5418-4a39-9ada-511d8423a40e"). InnerVolumeSpecName "isvc-sklearn-s3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:41:37.813958 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:37.813936 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d987bffd-5418-4a39-9ada-511d8423a40e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d987bffd-5418-4a39-9ada-511d8423a40e" (UID: "d987bffd-5418-4a39-9ada-511d8423a40e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:41:37.814148 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:37.814131 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d987bffd-5418-4a39-9ada-511d8423a40e-kube-api-access-6hgfb" (OuterVolumeSpecName: "kube-api-access-6hgfb") pod "d987bffd-5418-4a39-9ada-511d8423a40e" (UID: "d987bffd-5418-4a39-9ada-511d8423a40e"). InnerVolumeSpecName "kube-api-access-6hgfb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:41:37.913053 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:37.913008 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d987bffd-5418-4a39-9ada-511d8423a40e-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:41:37.913053 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:37.913043 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6hgfb\" (UniqueName: \"kubernetes.io/projected/d987bffd-5418-4a39-9ada-511d8423a40e-kube-api-access-6hgfb\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:41:37.913053 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:37.913055 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d987bffd-5418-4a39-9ada-511d8423a40e-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:41:37.913303 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:37.913066 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d987bffd-5418-4a39-9ada-511d8423a40e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:41:38.615872 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:38.615835 2573 generic.go:358] "Generic (PLEG): container finished" podID="d987bffd-5418-4a39-9ada-511d8423a40e" containerID="8cd976432385b2d438c7b13b5f79dbf06d9b13da2f8e0741a91b62729bd327a4" exitCode=0 Apr 24 17:41:38.616303 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:38.615915 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" 
event={"ID":"d987bffd-5418-4a39-9ada-511d8423a40e","Type":"ContainerDied","Data":"8cd976432385b2d438c7b13b5f79dbf06d9b13da2f8e0741a91b62729bd327a4"}
Apr 24 17:41:38.616303 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:38.615948 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx" event={"ID":"d987bffd-5418-4a39-9ada-511d8423a40e","Type":"ContainerDied","Data":"d8da63ff693c7b7d9f814ab7646947395b447df4a24b698d172843aaee945a5e"}
Apr 24 17:41:38.616303 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:38.615963 2573 scope.go:117] "RemoveContainer" containerID="0daa0e64fbb3fc6b5f28c8c56468d8c52165607284068f26008f9ab687b4e2b1"
Apr 24 17:41:38.616303 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:38.615962 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx"
Apr 24 17:41:38.616567 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:38.616514 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" podUID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 24 17:41:38.624063 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:38.624035 2573 scope.go:117] "RemoveContainer" containerID="8cd976432385b2d438c7b13b5f79dbf06d9b13da2f8e0741a91b62729bd327a4"
Apr 24 17:41:38.631586 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:38.631561 2573 scope.go:117] "RemoveContainer" containerID="edceab4bb0b3eee7643d3146d26428982cd55521d794fb948337b827930859f3"
Apr 24 17:41:38.632885 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:38.632856 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx"]
Apr 24 17:41:38.636694 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:38.636671 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-857d79f78-fvmqx"]
Apr 24 17:41:38.639460 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:38.639441 2573 scope.go:117] "RemoveContainer" containerID="0daa0e64fbb3fc6b5f28c8c56468d8c52165607284068f26008f9ab687b4e2b1"
Apr 24 17:41:38.639747 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:41:38.639717 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0daa0e64fbb3fc6b5f28c8c56468d8c52165607284068f26008f9ab687b4e2b1\": container with ID starting with 0daa0e64fbb3fc6b5f28c8c56468d8c52165607284068f26008f9ab687b4e2b1 not found: ID does not exist" containerID="0daa0e64fbb3fc6b5f28c8c56468d8c52165607284068f26008f9ab687b4e2b1"
Apr 24 17:41:38.639799 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:38.639748 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0daa0e64fbb3fc6b5f28c8c56468d8c52165607284068f26008f9ab687b4e2b1"} err="failed to get container status \"0daa0e64fbb3fc6b5f28c8c56468d8c52165607284068f26008f9ab687b4e2b1\": rpc error: code = NotFound desc = could not find container \"0daa0e64fbb3fc6b5f28c8c56468d8c52165607284068f26008f9ab687b4e2b1\": container with ID starting with 0daa0e64fbb3fc6b5f28c8c56468d8c52165607284068f26008f9ab687b4e2b1 not found: ID does not exist"
Apr 24 17:41:38.639799 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:38.639768 2573 scope.go:117] "RemoveContainer" containerID="8cd976432385b2d438c7b13b5f79dbf06d9b13da2f8e0741a91b62729bd327a4"
Apr 24 17:41:38.640031 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:41:38.640013 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cd976432385b2d438c7b13b5f79dbf06d9b13da2f8e0741a91b62729bd327a4\": container with ID starting with 8cd976432385b2d438c7b13b5f79dbf06d9b13da2f8e0741a91b62729bd327a4 not found: ID does not exist" containerID="8cd976432385b2d438c7b13b5f79dbf06d9b13da2f8e0741a91b62729bd327a4"
Apr 24 17:41:38.640086 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:38.640036 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cd976432385b2d438c7b13b5f79dbf06d9b13da2f8e0741a91b62729bd327a4"} err="failed to get container status \"8cd976432385b2d438c7b13b5f79dbf06d9b13da2f8e0741a91b62729bd327a4\": rpc error: code = NotFound desc = could not find container \"8cd976432385b2d438c7b13b5f79dbf06d9b13da2f8e0741a91b62729bd327a4\": container with ID starting with 8cd976432385b2d438c7b13b5f79dbf06d9b13da2f8e0741a91b62729bd327a4 not found: ID does not exist"
Apr 24 17:41:38.640086 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:38.640051 2573 scope.go:117] "RemoveContainer" containerID="edceab4bb0b3eee7643d3146d26428982cd55521d794fb948337b827930859f3"
Apr 24 17:41:38.640229 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:41:38.640215 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edceab4bb0b3eee7643d3146d26428982cd55521d794fb948337b827930859f3\": container with ID starting with edceab4bb0b3eee7643d3146d26428982cd55521d794fb948337b827930859f3 not found: ID does not exist" containerID="edceab4bb0b3eee7643d3146d26428982cd55521d794fb948337b827930859f3"
Apr 24 17:41:38.640268 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:38.640231 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edceab4bb0b3eee7643d3146d26428982cd55521d794fb948337b827930859f3"} err="failed to get container status \"edceab4bb0b3eee7643d3146d26428982cd55521d794fb948337b827930859f3\": rpc error: code = NotFound desc = could not find container \"edceab4bb0b3eee7643d3146d26428982cd55521d794fb948337b827930859f3\": container with ID starting with edceab4bb0b3eee7643d3146d26428982cd55521d794fb948337b827930859f3 not found: ID does not exist"
Apr 24 17:41:40.216968 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:40.216934 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d987bffd-5418-4a39-9ada-511d8423a40e" path="/var/lib/kubelet/pods/d987bffd-5418-4a39-9ada-511d8423a40e/volumes"
Apr 24 17:41:43.620772 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:43.620744 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz"
Apr 24 17:41:43.621420 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:43.621389 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" podUID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 24 17:41:53.621398 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:41:53.621288 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" podUID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 24 17:42:03.621560 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:03.621516 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" podUID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 24 17:42:13.622119 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:13.622076 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" podUID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 24 17:42:23.622146 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:23.622093 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" podUID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 24 17:42:33.621982 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:33.621937 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" podUID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 24 17:42:43.622462 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:43.622432 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz"
Apr 24 17:42:53.363772 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:53.363735 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz"]
Apr 24 17:42:53.364211 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:53.364075 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" podUID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" containerName="kserve-container" containerID="cri-o://442a0c90a9069f86f56c4c3ba11d12f68bdebdcfc8fe3a3f8dc20a80c1698808" gracePeriod=30
Apr 24 17:42:53.364211 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:53.364143 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" podUID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" containerName="kube-rbac-proxy" containerID="cri-o://b3b24d91b2cfbd90e4a98db0b74d34f695d153da00241db78ecd556cf0d7b1a9" gracePeriod=30
Apr 24 17:42:53.617589 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:53.617488 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" podUID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.62:8643/healthz\": dial tcp 10.134.0.62:8643: connect: connection refused"
Apr 24 17:42:53.621924 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:53.621877 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" podUID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 24 17:42:53.836634 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:53.836597 2573 generic.go:358] "Generic (PLEG): container finished" podID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" containerID="b3b24d91b2cfbd90e4a98db0b74d34f695d153da00241db78ecd556cf0d7b1a9" exitCode=2
Apr 24 17:42:53.836804 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:53.836675 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" event={"ID":"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1","Type":"ContainerDied","Data":"b3b24d91b2cfbd90e4a98db0b74d34f695d153da00241db78ecd556cf0d7b1a9"}
Apr 24 17:42:54.467184 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.467144 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9"]
Apr 24 17:42:54.467692 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.467621 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d987bffd-5418-4a39-9ada-511d8423a40e" containerName="storage-initializer"
Apr 24 17:42:54.467692 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.467642 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d987bffd-5418-4a39-9ada-511d8423a40e" containerName="storage-initializer"
Apr 24 17:42:54.467692 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.467653 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d987bffd-5418-4a39-9ada-511d8423a40e" containerName="kserve-container"
Apr 24 17:42:54.467692 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.467661 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d987bffd-5418-4a39-9ada-511d8423a40e" containerName="kserve-container"
Apr 24 17:42:54.467692 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.467677 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d987bffd-5418-4a39-9ada-511d8423a40e" containerName="kube-rbac-proxy"
Apr 24 17:42:54.467692 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.467686 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d987bffd-5418-4a39-9ada-511d8423a40e" containerName="kube-rbac-proxy"
Apr 24 17:42:54.468002 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.467764 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d987bffd-5418-4a39-9ada-511d8423a40e" containerName="kube-rbac-proxy"
Apr 24 17:42:54.468002 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.467776 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d987bffd-5418-4a39-9ada-511d8423a40e" containerName="kserve-container"
Apr 24 17:42:54.470919 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.470899 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9"
Apr 24 17:42:54.473497 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.473478 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-predictor-serving-cert\""
Apr 24 17:42:54.473835 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.473814 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\""
Apr 24 17:42:54.480122 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.480101 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9"]
Apr 24 17:42:54.531226 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.531182 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/939dee20-60d6-40f2-9efd-6fa17014212c-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9\" (UID: \"939dee20-60d6-40f2-9efd-6fa17014212c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9"
Apr 24 17:42:54.531226 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.531230 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/939dee20-60d6-40f2-9efd-6fa17014212c-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9\" (UID: \"939dee20-60d6-40f2-9efd-6fa17014212c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9"
Apr 24 17:42:54.531569 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.531375 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2fsn\" (UniqueName: \"kubernetes.io/projected/939dee20-60d6-40f2-9efd-6fa17014212c-kube-api-access-t2fsn\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9\" (UID: \"939dee20-60d6-40f2-9efd-6fa17014212c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9"
Apr 24 17:42:54.531569 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.531434 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/939dee20-60d6-40f2-9efd-6fa17014212c-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9\" (UID: \"939dee20-60d6-40f2-9efd-6fa17014212c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9"
Apr 24 17:42:54.632664 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.632629 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2fsn\" (UniqueName: \"kubernetes.io/projected/939dee20-60d6-40f2-9efd-6fa17014212c-kube-api-access-t2fsn\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9\" (UID: \"939dee20-60d6-40f2-9efd-6fa17014212c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9"
Apr 24 17:42:54.632873 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.632673 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/939dee20-60d6-40f2-9efd-6fa17014212c-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9\" (UID: \"939dee20-60d6-40f2-9efd-6fa17014212c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9"
Apr 24 17:42:54.632873 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.632726 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/939dee20-60d6-40f2-9efd-6fa17014212c-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9\" (UID: \"939dee20-60d6-40f2-9efd-6fa17014212c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9"
Apr 24 17:42:54.632873 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.632746 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/939dee20-60d6-40f2-9efd-6fa17014212c-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9\" (UID: \"939dee20-60d6-40f2-9efd-6fa17014212c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9"
Apr 24 17:42:54.633193 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.633170 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/939dee20-60d6-40f2-9efd-6fa17014212c-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9\" (UID: \"939dee20-60d6-40f2-9efd-6fa17014212c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9"
Apr 24 17:42:54.633391 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.633370 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/939dee20-60d6-40f2-9efd-6fa17014212c-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9\" (UID: \"939dee20-60d6-40f2-9efd-6fa17014212c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9"
Apr 24 17:42:54.635357 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.635335 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/939dee20-60d6-40f2-9efd-6fa17014212c-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9\" (UID: \"939dee20-60d6-40f2-9efd-6fa17014212c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9"
Apr 24 17:42:54.642253 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.642232 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2fsn\" (UniqueName: \"kubernetes.io/projected/939dee20-60d6-40f2-9efd-6fa17014212c-kube-api-access-t2fsn\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9\" (UID: \"939dee20-60d6-40f2-9efd-6fa17014212c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9"
Apr 24 17:42:54.781590 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.781537 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9"
Apr 24 17:42:54.908060 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:54.908034 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9"]
Apr 24 17:42:54.910745 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:42:54.910718 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod939dee20_60d6_40f2_9efd_6fa17014212c.slice/crio-d7bfdf0c9998402a20221c68d53ee6a2278385aa9d11c323ba43793064e516b7 WatchSource:0}: Error finding container d7bfdf0c9998402a20221c68d53ee6a2278385aa9d11c323ba43793064e516b7: Status 404 returned error can't find the container with id d7bfdf0c9998402a20221c68d53ee6a2278385aa9d11c323ba43793064e516b7
Apr 24 17:42:55.844018 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:55.843976 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9" event={"ID":"939dee20-60d6-40f2-9efd-6fa17014212c","Type":"ContainerStarted","Data":"05d34122de5436a90a0454b103a52fc9023c6836955378eabcae127abf11037b"}
Apr 24 17:42:55.844018 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:55.844022 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9" event={"ID":"939dee20-60d6-40f2-9efd-6fa17014212c","Type":"ContainerStarted","Data":"d7bfdf0c9998402a20221c68d53ee6a2278385aa9d11c323ba43793064e516b7"}
Apr 24 17:42:57.806796 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.806767 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz"
Apr 24 17:42:57.851244 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.851207 2573 generic.go:358] "Generic (PLEG): container finished" podID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" containerID="442a0c90a9069f86f56c4c3ba11d12f68bdebdcfc8fe3a3f8dc20a80c1698808" exitCode=0
Apr 24 17:42:57.851443 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.851249 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" event={"ID":"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1","Type":"ContainerDied","Data":"442a0c90a9069f86f56c4c3ba11d12f68bdebdcfc8fe3a3f8dc20a80c1698808"}
Apr 24 17:42:57.851443 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.851277 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz" event={"ID":"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1","Type":"ContainerDied","Data":"71ba1037d23487ec15ae7c3c944c8d721bf997d4003b26b82dd0001c41684b6e"}
Apr 24 17:42:57.851443 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.851292 2573 scope.go:117] "RemoveContainer" containerID="b3b24d91b2cfbd90e4a98db0b74d34f695d153da00241db78ecd556cf0d7b1a9"
Apr 24 17:42:57.851443 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.851372 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz"
Apr 24 17:42:57.856025 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.856001 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-proxy-tls\") pod \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\" (UID: \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\") "
Apr 24 17:42:57.856156 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.856098 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\" (UID: \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\") "
Apr 24 17:42:57.856156 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.856124 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-kserve-provision-location\") pod \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\" (UID: \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\") "
Apr 24 17:42:57.856156 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.856149 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zxdd\" (UniqueName: \"kubernetes.io/projected/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-kube-api-access-5zxdd\") pod \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\" (UID: \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\") "
Apr 24 17:42:57.856356 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.856163 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-cabundle-cert\") pod \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\" (UID: \"6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1\") "
Apr 24 17:42:57.856559 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.856527 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" (UID: "6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 17:42:57.856668 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.856568 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config") pod "6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" (UID: "6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 17:42:57.856668 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.856606 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" (UID: "6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 17:42:57.858260 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.858232 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" (UID: "6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 17:42:57.858382 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.858364 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-kube-api-access-5zxdd" (OuterVolumeSpecName: "kube-api-access-5zxdd") pod "6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" (UID: "6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1"). InnerVolumeSpecName "kube-api-access-5zxdd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 17:42:57.863101 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.863055 2573 scope.go:117] "RemoveContainer" containerID="442a0c90a9069f86f56c4c3ba11d12f68bdebdcfc8fe3a3f8dc20a80c1698808"
Apr 24 17:42:57.872001 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.871974 2573 scope.go:117] "RemoveContainer" containerID="6126a2e5e7641fdf787ee57d3fb865a7e86a31d3c04675605acc53b8689e5b94"
Apr 24 17:42:57.879040 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.879018 2573 scope.go:117] "RemoveContainer" containerID="b3b24d91b2cfbd90e4a98db0b74d34f695d153da00241db78ecd556cf0d7b1a9"
Apr 24 17:42:57.879319 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:42:57.879285 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3b24d91b2cfbd90e4a98db0b74d34f695d153da00241db78ecd556cf0d7b1a9\": container with ID starting with b3b24d91b2cfbd90e4a98db0b74d34f695d153da00241db78ecd556cf0d7b1a9 not found: ID does not exist" containerID="b3b24d91b2cfbd90e4a98db0b74d34f695d153da00241db78ecd556cf0d7b1a9"
Apr 24 17:42:57.879379 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.879332 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3b24d91b2cfbd90e4a98db0b74d34f695d153da00241db78ecd556cf0d7b1a9"} err="failed to get container status \"b3b24d91b2cfbd90e4a98db0b74d34f695d153da00241db78ecd556cf0d7b1a9\": rpc error: code = NotFound desc = could not find container \"b3b24d91b2cfbd90e4a98db0b74d34f695d153da00241db78ecd556cf0d7b1a9\": container with ID starting with b3b24d91b2cfbd90e4a98db0b74d34f695d153da00241db78ecd556cf0d7b1a9 not found: ID does not exist"
Apr 24 17:42:57.879379 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.879352 2573 scope.go:117] "RemoveContainer" containerID="442a0c90a9069f86f56c4c3ba11d12f68bdebdcfc8fe3a3f8dc20a80c1698808"
Apr 24 17:42:57.879578 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:42:57.879557 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"442a0c90a9069f86f56c4c3ba11d12f68bdebdcfc8fe3a3f8dc20a80c1698808\": container with ID starting with 442a0c90a9069f86f56c4c3ba11d12f68bdebdcfc8fe3a3f8dc20a80c1698808 not found: ID does not exist" containerID="442a0c90a9069f86f56c4c3ba11d12f68bdebdcfc8fe3a3f8dc20a80c1698808"
Apr 24 17:42:57.879679 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.879581 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"442a0c90a9069f86f56c4c3ba11d12f68bdebdcfc8fe3a3f8dc20a80c1698808"} err="failed to get container status \"442a0c90a9069f86f56c4c3ba11d12f68bdebdcfc8fe3a3f8dc20a80c1698808\": rpc error: code = NotFound desc = could not find container \"442a0c90a9069f86f56c4c3ba11d12f68bdebdcfc8fe3a3f8dc20a80c1698808\": container with ID starting with 442a0c90a9069f86f56c4c3ba11d12f68bdebdcfc8fe3a3f8dc20a80c1698808 not found: ID does not exist"
Apr 24 17:42:57.879679 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.879600 2573 scope.go:117] "RemoveContainer" containerID="6126a2e5e7641fdf787ee57d3fb865a7e86a31d3c04675605acc53b8689e5b94"
Apr 24 17:42:57.879821 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:42:57.879801 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6126a2e5e7641fdf787ee57d3fb865a7e86a31d3c04675605acc53b8689e5b94\": container with ID starting with 6126a2e5e7641fdf787ee57d3fb865a7e86a31d3c04675605acc53b8689e5b94 not found: ID does not exist" containerID="6126a2e5e7641fdf787ee57d3fb865a7e86a31d3c04675605acc53b8689e5b94"
Apr 24 17:42:57.879859 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.879829 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6126a2e5e7641fdf787ee57d3fb865a7e86a31d3c04675605acc53b8689e5b94"} err="failed to get container status \"6126a2e5e7641fdf787ee57d3fb865a7e86a31d3c04675605acc53b8689e5b94\": rpc error: code = NotFound desc = could not find container \"6126a2e5e7641fdf787ee57d3fb865a7e86a31d3c04675605acc53b8689e5b94\": container with ID starting with 6126a2e5e7641fdf787ee57d3fb865a7e86a31d3c04675605acc53b8689e5b94 not found: ID does not exist"
Apr 24 17:42:57.957564 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.957474 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5zxdd\" (UniqueName: \"kubernetes.io/projected/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-kube-api-access-5zxdd\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 17:42:57.957564 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.957505 2573 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-cabundle-cert\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 17:42:57.957564 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.957517 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 17:42:57.957564 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.957526 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 17:42:57.957564 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:57.957535 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\""
Apr 24 17:42:58.173021 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:58.172982 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz"]
Apr 24 17:42:58.176467 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:58.176440 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5fc97ccd8b-8tsmz"]
Apr 24 17:42:58.216456 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:58.216373 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" path="/var/lib/kubelet/pods/6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1/volumes"
Apr 24 17:42:58.855632 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:58.855553 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9_939dee20-60d6-40f2-9efd-6fa17014212c/storage-initializer/0.log"
Apr 24 17:42:58.855632 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:58.855591 2573 generic.go:358] "Generic (PLEG): container finished" podID="939dee20-60d6-40f2-9efd-6fa17014212c" containerID="05d34122de5436a90a0454b103a52fc9023c6836955378eabcae127abf11037b" exitCode=1
Apr 24 17:42:58.856058 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:58.855669 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9" event={"ID":"939dee20-60d6-40f2-9efd-6fa17014212c","Type":"ContainerDied","Data":"05d34122de5436a90a0454b103a52fc9023c6836955378eabcae127abf11037b"}
Apr 24 17:42:59.861270 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:59.861237 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9_939dee20-60d6-40f2-9efd-6fa17014212c/storage-initializer/0.log"
Apr 24 17:42:59.861675 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:42:59.861298 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9" event={"ID":"939dee20-60d6-40f2-9efd-6fa17014212c","Type":"ContainerStarted","Data":"740ffbb2fc7841fd9269158186ec0e462f7f610c9ff691cadf626961d710774b"}
Apr 24 17:43:01.868549 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:01.868521 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9_939dee20-60d6-40f2-9efd-6fa17014212c/storage-initializer/1.log"
Apr 24 17:43:01.868936 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:01.868836 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9_939dee20-60d6-40f2-9efd-6fa17014212c/storage-initializer/0.log"
Apr 24 17:43:01.868936 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:01.868873 2573 generic.go:358] "Generic (PLEG): container finished" podID="939dee20-60d6-40f2-9efd-6fa17014212c" containerID="740ffbb2fc7841fd9269158186ec0e462f7f610c9ff691cadf626961d710774b" exitCode=1
Apr 24 17:43:01.869011 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:01.868950 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9" event={"ID":"939dee20-60d6-40f2-9efd-6fa17014212c","Type":"ContainerDied","Data":"740ffbb2fc7841fd9269158186ec0e462f7f610c9ff691cadf626961d710774b"}
Apr 24 17:43:01.869011 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:01.868991 2573 scope.go:117] "RemoveContainer" containerID="05d34122de5436a90a0454b103a52fc9023c6836955378eabcae127abf11037b"
Apr 24 17:43:01.869410 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:01.869394 2573 scope.go:117] "RemoveContainer" containerID="05d34122de5436a90a0454b103a52fc9023c6836955378eabcae127abf11037b"
Apr 24 17:43:01.879248 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:43:01.879215 2573 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9_kserve-ci-e2e-test_939dee20-60d6-40f2-9efd-6fa17014212c_0 in pod sandbox d7bfdf0c9998402a20221c68d53ee6a2278385aa9d11c323ba43793064e516b7 from index: no such id: '05d34122de5436a90a0454b103a52fc9023c6836955378eabcae127abf11037b'" containerID="05d34122de5436a90a0454b103a52fc9023c6836955378eabcae127abf11037b"
Apr 24 17:43:01.879343 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:43:01.879268 2573 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9_kserve-ci-e2e-test_939dee20-60d6-40f2-9efd-6fa17014212c_0 in pod sandbox d7bfdf0c9998402a20221c68d53ee6a2278385aa9d11c323ba43793064e516b7 from index: no such id: '05d34122de5436a90a0454b103a52fc9023c6836955378eabcae127abf11037b'; Skipping pod \"isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9_kserve-ci-e2e-test(939dee20-60d6-40f2-9efd-6fa17014212c)\"" logger="UnhandledError"
Apr 24 17:43:01.880575 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:43:01.880553 2573 pod_workers.go:1301] "Error syncing pod,
skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9_kserve-ci-e2e-test(939dee20-60d6-40f2-9efd-6fa17014212c)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9" podUID="939dee20-60d6-40f2-9efd-6fa17014212c" Apr 24 17:43:02.873572 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:02.873536 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9_939dee20-60d6-40f2-9efd-6fa17014212c/storage-initializer/1.log" Apr 24 17:43:04.474689 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:04.474645 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9"] Apr 24 17:43:04.597973 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:04.597947 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9_939dee20-60d6-40f2-9efd-6fa17014212c/storage-initializer/1.log" Apr 24 17:43:04.598086 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:04.598014 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9" Apr 24 17:43:04.709359 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:04.709298 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/939dee20-60d6-40f2-9efd-6fa17014212c-proxy-tls\") pod \"939dee20-60d6-40f2-9efd-6fa17014212c\" (UID: \"939dee20-60d6-40f2-9efd-6fa17014212c\") " Apr 24 17:43:04.709615 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:04.709407 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/939dee20-60d6-40f2-9efd-6fa17014212c-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"939dee20-60d6-40f2-9efd-6fa17014212c\" (UID: \"939dee20-60d6-40f2-9efd-6fa17014212c\") " Apr 24 17:43:04.709615 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:04.709436 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2fsn\" (UniqueName: \"kubernetes.io/projected/939dee20-60d6-40f2-9efd-6fa17014212c-kube-api-access-t2fsn\") pod \"939dee20-60d6-40f2-9efd-6fa17014212c\" (UID: \"939dee20-60d6-40f2-9efd-6fa17014212c\") " Apr 24 17:43:04.709615 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:04.709458 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/939dee20-60d6-40f2-9efd-6fa17014212c-kserve-provision-location\") pod \"939dee20-60d6-40f2-9efd-6fa17014212c\" (UID: \"939dee20-60d6-40f2-9efd-6fa17014212c\") " Apr 24 17:43:04.709838 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:04.709737 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/939dee20-60d6-40f2-9efd-6fa17014212c-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "939dee20-60d6-40f2-9efd-6fa17014212c" (UID: "939dee20-60d6-40f2-9efd-6fa17014212c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:43:04.709900 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:04.709874 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/939dee20-60d6-40f2-9efd-6fa17014212c-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config") pod "939dee20-60d6-40f2-9efd-6fa17014212c" (UID: "939dee20-60d6-40f2-9efd-6fa17014212c"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:43:04.711635 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:04.711612 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/939dee20-60d6-40f2-9efd-6fa17014212c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "939dee20-60d6-40f2-9efd-6fa17014212c" (UID: "939dee20-60d6-40f2-9efd-6fa17014212c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:43:04.711781 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:04.711763 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/939dee20-60d6-40f2-9efd-6fa17014212c-kube-api-access-t2fsn" (OuterVolumeSpecName: "kube-api-access-t2fsn") pod "939dee20-60d6-40f2-9efd-6fa17014212c" (UID: "939dee20-60d6-40f2-9efd-6fa17014212c"). InnerVolumeSpecName "kube-api-access-t2fsn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:43:04.810276 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:04.810221 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/939dee20-60d6-40f2-9efd-6fa17014212c-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:43:04.810276 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:04.810269 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t2fsn\" (UniqueName: \"kubernetes.io/projected/939dee20-60d6-40f2-9efd-6fa17014212c-kube-api-access-t2fsn\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:43:04.810276 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:04.810281 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/939dee20-60d6-40f2-9efd-6fa17014212c-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:43:04.810276 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:04.810290 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/939dee20-60d6-40f2-9efd-6fa17014212c-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:43:04.880744 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:04.880717 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9_939dee20-60d6-40f2-9efd-6fa17014212c/storage-initializer/1.log" Apr 24 17:43:04.880907 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:04.880860 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9" Apr 24 17:43:04.880956 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:04.880856 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9" event={"ID":"939dee20-60d6-40f2-9efd-6fa17014212c","Type":"ContainerDied","Data":"d7bfdf0c9998402a20221c68d53ee6a2278385aa9d11c323ba43793064e516b7"} Apr 24 17:43:04.880993 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:04.880981 2573 scope.go:117] "RemoveContainer" containerID="740ffbb2fc7841fd9269158186ec0e462f7f610c9ff691cadf626961d710774b" Apr 24 17:43:04.914498 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:04.914465 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9"] Apr 24 17:43:04.918416 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:04.918388 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-5fbf87dd-pxqn9"] Apr 24 17:43:05.547723 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.547689 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9"] Apr 24 17:43:05.548075 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.547975 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" containerName="storage-initializer" Apr 24 17:43:05.548075 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.547991 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" containerName="storage-initializer" Apr 24 17:43:05.548075 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.548001 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" 
containerName="kserve-container" Apr 24 17:43:05.548075 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.548007 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" containerName="kserve-container" Apr 24 17:43:05.548075 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.548016 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="939dee20-60d6-40f2-9efd-6fa17014212c" containerName="storage-initializer" Apr 24 17:43:05.548075 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.548022 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="939dee20-60d6-40f2-9efd-6fa17014212c" containerName="storage-initializer" Apr 24 17:43:05.548075 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.548032 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="939dee20-60d6-40f2-9efd-6fa17014212c" containerName="storage-initializer" Apr 24 17:43:05.548075 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.548039 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="939dee20-60d6-40f2-9efd-6fa17014212c" containerName="storage-initializer" Apr 24 17:43:05.548075 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.548049 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" containerName="kube-rbac-proxy" Apr 24 17:43:05.548075 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.548054 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" containerName="kube-rbac-proxy" Apr 24 17:43:05.548514 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.548110 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="939dee20-60d6-40f2-9efd-6fa17014212c" containerName="storage-initializer" Apr 24 17:43:05.548514 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.548119 2573 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" containerName="kserve-container" Apr 24 17:43:05.548514 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.548129 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f5f6a76-a410-4fb0-bb9a-77705a4cf1a1" containerName="kube-rbac-proxy" Apr 24 17:43:05.548514 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.548221 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="939dee20-60d6-40f2-9efd-6fa17014212c" containerName="storage-initializer" Apr 24 17:43:05.552643 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.552619 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" Apr 24 17:43:05.555030 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.555005 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 24 17:43:05.555170 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.555035 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert\"" Apr 24 17:43:05.555170 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.555018 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\"" Apr 24 17:43:05.555170 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.555011 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 17:43:05.555874 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.555856 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 17:43:05.555973 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.555901 2573 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-2jnrr\"" Apr 24 17:43:05.556113 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.556096 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 17:43:05.561944 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.561922 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9"] Apr 24 17:43:05.616534 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.616496 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz5f5\" (UniqueName: \"kubernetes.io/projected/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-kube-api-access-xz5f5\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9\" (UID: \"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" Apr 24 17:43:05.616712 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.616544 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9\" (UID: \"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" Apr 24 17:43:05.616712 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.616611 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-kserve-provision-location\") pod 
\"isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9\" (UID: \"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" Apr 24 17:43:05.616712 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.616661 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9\" (UID: \"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" Apr 24 17:43:05.616712 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.616693 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9\" (UID: \"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" Apr 24 17:43:05.717396 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.717355 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9\" (UID: \"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" Apr 24 17:43:05.717559 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.717420 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9\" (UID: 
\"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" Apr 24 17:43:05.717559 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.717466 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz5f5\" (UniqueName: \"kubernetes.io/projected/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-kube-api-access-xz5f5\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9\" (UID: \"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" Apr 24 17:43:05.717559 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.717506 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9\" (UID: \"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" Apr 24 17:43:05.717559 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.717551 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9\" (UID: \"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" Apr 24 17:43:05.717930 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.717904 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-kserve-provision-location\") pod 
\"isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9\" (UID: \"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" Apr 24 17:43:05.718230 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.718202 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9\" (UID: \"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" Apr 24 17:43:05.718301 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.718221 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9\" (UID: \"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" Apr 24 17:43:05.719969 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.719953 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9\" (UID: \"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" Apr 24 17:43:05.725973 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.725948 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz5f5\" (UniqueName: \"kubernetes.io/projected/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-kube-api-access-xz5f5\") pod 
\"isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9\" (UID: \"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" Apr 24 17:43:05.863631 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.863540 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" Apr 24 17:43:05.986374 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:05.986344 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9"] Apr 24 17:43:05.988596 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:43:05.988565 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb63748f_5a6b_41f3_8e4b_c85ac2bc283c.slice/crio-97efaf960e031977001b285515b6a33dd4300d9ede7469a14e1275e367fd7c80 WatchSource:0}: Error finding container 97efaf960e031977001b285515b6a33dd4300d9ede7469a14e1275e367fd7c80: Status 404 returned error can't find the container with id 97efaf960e031977001b285515b6a33dd4300d9ede7469a14e1275e367fd7c80 Apr 24 17:43:06.216486 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:06.216408 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="939dee20-60d6-40f2-9efd-6fa17014212c" path="/var/lib/kubelet/pods/939dee20-60d6-40f2-9efd-6fa17014212c/volumes" Apr 24 17:43:06.890583 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:06.890544 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" event={"ID":"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c","Type":"ContainerStarted","Data":"3af9a231278c58f194817526c8e19013f5b013bc08d4ca1ccdaf185166760c90"} Apr 24 17:43:06.890583 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:06.890589 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" event={"ID":"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c","Type":"ContainerStarted","Data":"97efaf960e031977001b285515b6a33dd4300d9ede7469a14e1275e367fd7c80"} Apr 24 17:43:07.895204 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:07.895169 2573 generic.go:358] "Generic (PLEG): container finished" podID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" containerID="3af9a231278c58f194817526c8e19013f5b013bc08d4ca1ccdaf185166760c90" exitCode=0 Apr 24 17:43:07.895621 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:07.895221 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" event={"ID":"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c","Type":"ContainerDied","Data":"3af9a231278c58f194817526c8e19013f5b013bc08d4ca1ccdaf185166760c90"} Apr 24 17:43:08.899940 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:08.899902 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" event={"ID":"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c","Type":"ContainerStarted","Data":"948250c386902097048508cccb92ab19a74ffe529637a5bd08eb9814464224d7"} Apr 24 17:43:08.899940 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:08.899940 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" event={"ID":"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c","Type":"ContainerStarted","Data":"b20085b116d37aea645317456635fc3217e0f0ce68253cc70f9513e461229db8"} Apr 24 17:43:08.900379 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:08.900111 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" Apr 24 17:43:08.918537 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:08.918488 2573 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" podStartSLOduration=3.918472548 podStartE2EDuration="3.918472548s" podCreationTimestamp="2026-04-24 17:43:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:43:08.91797536 +0000 UTC m=+3847.182690959" watchObservedRunningTime="2026-04-24 17:43:08.918472548 +0000 UTC m=+3847.183187942" Apr 24 17:43:09.902622 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:09.902591 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" Apr 24 17:43:09.903896 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:09.903866 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" podUID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 24 17:43:10.906061 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:10.906011 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" podUID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 24 17:43:15.910823 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:15.910793 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" Apr 24 17:43:15.911483 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:15.911449 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" 
podUID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 24 17:43:25.911630 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:25.911585 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" podUID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 24 17:43:35.911441 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:35.911396 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" podUID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 24 17:43:45.911546 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:45.911500 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" podUID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 24 17:43:55.912343 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:43:55.912282 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" podUID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 24 17:44:05.911539 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:05.911497 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" podUID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 24 17:44:07.781284 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:07.781252 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log" Apr 24 17:44:07.789872 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:07.789843 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log" Apr 24 17:44:15.912060 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:15.912028 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" Apr 24 17:44:25.568279 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:25.568238 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9"] Apr 24 17:44:25.568696 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:25.568598 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" podUID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" containerName="kserve-container" containerID="cri-o://b20085b116d37aea645317456635fc3217e0f0ce68253cc70f9513e461229db8" gracePeriod=30 Apr 24 17:44:25.568696 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:25.568647 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" podUID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" containerName="kube-rbac-proxy" containerID="cri-o://948250c386902097048508cccb92ab19a74ffe529637a5bd08eb9814464224d7" gracePeriod=30 Apr 24 17:44:25.907159 ip-10-0-142-182 kubenswrapper[2573]: I0424 
17:44:25.907052 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" podUID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.64:8643/healthz\": dial tcp 10.134.0.64:8643: connect: connection refused" Apr 24 17:44:25.911477 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:25.911444 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" podUID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 24 17:44:26.118686 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:26.118653 2573 generic.go:358] "Generic (PLEG): container finished" podID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" containerID="948250c386902097048508cccb92ab19a74ffe529637a5bd08eb9814464224d7" exitCode=2 Apr 24 17:44:26.118846 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:26.118720 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" event={"ID":"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c","Type":"ContainerDied","Data":"948250c386902097048508cccb92ab19a74ffe529637a5bd08eb9814464224d7"} Apr 24 17:44:26.692145 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:26.692108 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8"] Apr 24 17:44:26.695565 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:26.695546 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8" Apr 24 17:44:26.697869 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:26.697845 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert\"" Apr 24 17:44:26.697996 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:26.697909 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\"" Apr 24 17:44:26.709114 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:26.709085 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8"] Apr 24 17:44:26.785238 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:26.785206 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86p5g\" (UniqueName: \"kubernetes.io/projected/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-kube-api-access-86p5g\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8\" (UID: \"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8" Apr 24 17:44:26.785238 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:26.785247 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8\" (UID: \"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8" Apr 24 17:44:26.785464 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:26.785274 
2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8\" (UID: \"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8" Apr 24 17:44:26.785464 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:26.785293 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8\" (UID: \"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8" Apr 24 17:44:26.886599 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:26.886562 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8\" (UID: \"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8" Apr 24 17:44:26.886732 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:26.886638 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86p5g\" (UniqueName: \"kubernetes.io/projected/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-kube-api-access-86p5g\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8\" (UID: \"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8" Apr 24 17:44:26.886732 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:26.886666 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8\" (UID: \"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8" Apr 24 17:44:26.886732 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:26.886695 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8\" (UID: \"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8" Apr 24 17:44:26.887046 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:26.887019 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8\" (UID: \"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8" Apr 24 17:44:26.887622 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:26.887569 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8\" (UID: \"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8" Apr 24 
17:44:26.889250 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:26.889226 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8\" (UID: \"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8" Apr 24 17:44:26.895710 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:26.895676 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86p5g\" (UniqueName: \"kubernetes.io/projected/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-kube-api-access-86p5g\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8\" (UID: \"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8" Apr 24 17:44:27.005862 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:27.005815 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8" Apr 24 17:44:27.131230 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:27.131195 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8"] Apr 24 17:44:27.135214 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:44:27.135177 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d397b5f_6ad3_4fc0_8f4b_7cfae1ef1128.slice/crio-305dc3dfeeef7a4401db02b2d6d7d87c6c93108c4c1fc19b1bbff5b1a66bf4c1 WatchSource:0}: Error finding container 305dc3dfeeef7a4401db02b2d6d7d87c6c93108c4c1fc19b1bbff5b1a66bf4c1: Status 404 returned error can't find the container with id 305dc3dfeeef7a4401db02b2d6d7d87c6c93108c4c1fc19b1bbff5b1a66bf4c1 Apr 24 17:44:27.137070 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:27.137050 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 17:44:28.125836 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:28.125794 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8" event={"ID":"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128","Type":"ContainerStarted","Data":"7ff8f3cfe0d9cdcd72a0d82e1d166a3da3f92444a71e60fbe9c4ba4090dbee9b"} Apr 24 17:44:28.125836 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:28.125836 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8" event={"ID":"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128","Type":"ContainerStarted","Data":"305dc3dfeeef7a4401db02b2d6d7d87c6c93108c4c1fc19b1bbff5b1a66bf4c1"} Apr 24 17:44:30.115454 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.115424 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" Apr 24 17:44:30.137767 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.137711 2573 generic.go:358] "Generic (PLEG): container finished" podID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" containerID="b20085b116d37aea645317456635fc3217e0f0ce68253cc70f9513e461229db8" exitCode=0 Apr 24 17:44:30.137938 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.137754 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" event={"ID":"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c","Type":"ContainerDied","Data":"b20085b116d37aea645317456635fc3217e0f0ce68253cc70f9513e461229db8"} Apr 24 17:44:30.137938 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.137821 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" event={"ID":"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c","Type":"ContainerDied","Data":"97efaf960e031977001b285515b6a33dd4300d9ede7469a14e1275e367fd7c80"} Apr 24 17:44:30.137938 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.137868 2573 scope.go:117] "RemoveContainer" containerID="948250c386902097048508cccb92ab19a74ffe529637a5bd08eb9814464224d7" Apr 24 17:44:30.138104 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.138033 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9" Apr 24 17:44:30.146980 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.146955 2573 scope.go:117] "RemoveContainer" containerID="b20085b116d37aea645317456635fc3217e0f0ce68253cc70f9513e461229db8" Apr 24 17:44:30.154802 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.154778 2573 scope.go:117] "RemoveContainer" containerID="3af9a231278c58f194817526c8e19013f5b013bc08d4ca1ccdaf185166760c90" Apr 24 17:44:30.162567 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.162544 2573 scope.go:117] "RemoveContainer" containerID="948250c386902097048508cccb92ab19a74ffe529637a5bd08eb9814464224d7" Apr 24 17:44:30.162910 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:44:30.162890 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"948250c386902097048508cccb92ab19a74ffe529637a5bd08eb9814464224d7\": container with ID starting with 948250c386902097048508cccb92ab19a74ffe529637a5bd08eb9814464224d7 not found: ID does not exist" containerID="948250c386902097048508cccb92ab19a74ffe529637a5bd08eb9814464224d7" Apr 24 17:44:30.162984 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.162921 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"948250c386902097048508cccb92ab19a74ffe529637a5bd08eb9814464224d7"} err="failed to get container status \"948250c386902097048508cccb92ab19a74ffe529637a5bd08eb9814464224d7\": rpc error: code = NotFound desc = could not find container \"948250c386902097048508cccb92ab19a74ffe529637a5bd08eb9814464224d7\": container with ID starting with 948250c386902097048508cccb92ab19a74ffe529637a5bd08eb9814464224d7 not found: ID does not exist" Apr 24 17:44:30.162984 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.162940 2573 scope.go:117] "RemoveContainer" containerID="b20085b116d37aea645317456635fc3217e0f0ce68253cc70f9513e461229db8" 
Apr 24 17:44:30.163206 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:44:30.163187 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b20085b116d37aea645317456635fc3217e0f0ce68253cc70f9513e461229db8\": container with ID starting with b20085b116d37aea645317456635fc3217e0f0ce68253cc70f9513e461229db8 not found: ID does not exist" containerID="b20085b116d37aea645317456635fc3217e0f0ce68253cc70f9513e461229db8" Apr 24 17:44:30.163266 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.163213 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b20085b116d37aea645317456635fc3217e0f0ce68253cc70f9513e461229db8"} err="failed to get container status \"b20085b116d37aea645317456635fc3217e0f0ce68253cc70f9513e461229db8\": rpc error: code = NotFound desc = could not find container \"b20085b116d37aea645317456635fc3217e0f0ce68253cc70f9513e461229db8\": container with ID starting with b20085b116d37aea645317456635fc3217e0f0ce68253cc70f9513e461229db8 not found: ID does not exist" Apr 24 17:44:30.163266 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.163228 2573 scope.go:117] "RemoveContainer" containerID="3af9a231278c58f194817526c8e19013f5b013bc08d4ca1ccdaf185166760c90" Apr 24 17:44:30.163493 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:44:30.163476 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3af9a231278c58f194817526c8e19013f5b013bc08d4ca1ccdaf185166760c90\": container with ID starting with 3af9a231278c58f194817526c8e19013f5b013bc08d4ca1ccdaf185166760c90 not found: ID does not exist" containerID="3af9a231278c58f194817526c8e19013f5b013bc08d4ca1ccdaf185166760c90" Apr 24 17:44:30.163543 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.163500 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3af9a231278c58f194817526c8e19013f5b013bc08d4ca1ccdaf185166760c90"} err="failed to get container status \"3af9a231278c58f194817526c8e19013f5b013bc08d4ca1ccdaf185166760c90\": rpc error: code = NotFound desc = could not find container \"3af9a231278c58f194817526c8e19013f5b013bc08d4ca1ccdaf185166760c90\": container with ID starting with 3af9a231278c58f194817526c8e19013f5b013bc08d4ca1ccdaf185166760c90 not found: ID does not exist" Apr 24 17:44:30.214788 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.214699 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-cabundle-cert\") pod \"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\" (UID: \"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\") " Apr 24 17:44:30.214788 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.214744 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\" (UID: \"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\") " Apr 24 17:44:30.215039 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.214797 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-proxy-tls\") pod \"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\" (UID: \"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\") " Apr 24 17:44:30.215039 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.214841 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz5f5\" (UniqueName: \"kubernetes.io/projected/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-kube-api-access-xz5f5\") pod \"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\" (UID: 
\"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\") " Apr 24 17:44:30.215039 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.214877 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-kserve-provision-location\") pod \"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\" (UID: \"eb63748f-5a6b-41f3-8e4b-c85ac2bc283c\") " Apr 24 17:44:30.215204 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.215102 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" (UID: "eb63748f-5a6b-41f3-8e4b-c85ac2bc283c"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:44:30.215290 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.215266 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" (UID: "eb63748f-5a6b-41f3-8e4b-c85ac2bc283c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:44:30.215368 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.215276 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config") pod "eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" (UID: "eb63748f-5a6b-41f3-8e4b-c85ac2bc283c"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:44:30.217112 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.217087 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" (UID: "eb63748f-5a6b-41f3-8e4b-c85ac2bc283c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:44:30.217224 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.217109 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-kube-api-access-xz5f5" (OuterVolumeSpecName: "kube-api-access-xz5f5") pod "eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" (UID: "eb63748f-5a6b-41f3-8e4b-c85ac2bc283c"). InnerVolumeSpecName "kube-api-access-xz5f5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:44:30.316040 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.315995 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xz5f5\" (UniqueName: \"kubernetes.io/projected/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-kube-api-access-xz5f5\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:44:30.316040 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.316032 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:44:30.316040 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.316045 2573 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-cabundle-cert\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:44:30.316283 ip-10-0-142-182 
kubenswrapper[2573]: I0424 17:44:30.316060 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:44:30.316283 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.316072 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:44:30.459542 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.459511 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9"] Apr 24 17:44:30.463344 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:30.463295 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7559cfcdc4-8m4r9"] Apr 24 17:44:32.145494 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:32.145467 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8_1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128/storage-initializer/0.log" Apr 24 17:44:32.145862 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:32.145504 2573 generic.go:358] "Generic (PLEG): container finished" podID="1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128" containerID="7ff8f3cfe0d9cdcd72a0d82e1d166a3da3f92444a71e60fbe9c4ba4090dbee9b" exitCode=1 Apr 24 17:44:32.145862 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:32.145552 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8" 
event={"ID":"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128","Type":"ContainerDied","Data":"7ff8f3cfe0d9cdcd72a0d82e1d166a3da3f92444a71e60fbe9c4ba4090dbee9b"} Apr 24 17:44:32.217578 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:32.217540 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" path="/var/lib/kubelet/pods/eb63748f-5a6b-41f3-8e4b-c85ac2bc283c/volumes" Apr 24 17:44:33.149246 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:33.149216 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8_1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128/storage-initializer/0.log" Apr 24 17:44:33.149612 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:33.149291 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8" event={"ID":"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128","Type":"ContainerStarted","Data":"19666669f3c723b8ad654ace9d5e2b8201e5c9237a58c519b4a85f80b849a9bc"} Apr 24 17:44:36.359036 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:44:36.358996 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d397b5f_6ad3_4fc0_8f4b_7cfae1ef1128.slice/crio-conmon-19666669f3c723b8ad654ace9d5e2b8201e5c9237a58c519b4a85f80b849a9bc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d397b5f_6ad3_4fc0_8f4b_7cfae1ef1128.slice/crio-19666669f3c723b8ad654ace9d5e2b8201e5c9237a58c519b4a85f80b849a9bc.scope\": RecentStats: unable to find data in memory cache]" Apr 24 17:44:36.640197 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:36.640096 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8"] Apr 24 
17:44:36.640518 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:36.640480 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8" podUID="1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128" containerName="storage-initializer" containerID="cri-o://19666669f3c723b8ad654ace9d5e2b8201e5c9237a58c519b4a85f80b849a9bc" gracePeriod=30 Apr 24 17:44:36.765208 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:36.765184 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8_1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128/storage-initializer/1.log" Apr 24 17:44:36.765578 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:36.765563 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8_1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128/storage-initializer/0.log" Apr 24 17:44:36.765645 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:36.765633 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8" Apr 24 17:44:36.870745 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:36.870704 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86p5g\" (UniqueName: \"kubernetes.io/projected/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-kube-api-access-86p5g\") pod \"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128\" (UID: \"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128\") " Apr 24 17:44:36.870946 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:36.870772 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-proxy-tls\") pod \"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128\" (UID: \"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128\") " Apr 24 17:44:36.870946 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:36.870797 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128\" (UID: \"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128\") " Apr 24 17:44:36.870946 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:36.870817 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-kserve-provision-location\") pod \"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128\" (UID: \"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128\") " Apr 24 17:44:36.871190 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:36.871160 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config" 
(OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config") pod "1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128" (UID: "1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:44:36.871349 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:36.871166 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128" (UID: "1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:44:36.872941 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:36.872910 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-kube-api-access-86p5g" (OuterVolumeSpecName: "kube-api-access-86p5g") pod "1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128" (UID: "1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128"). InnerVolumeSpecName "kube-api-access-86p5g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:44:36.873033 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:36.872942 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128" (UID: "1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:44:36.971724 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:36.971631 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:44:36.971724 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:36.971664 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-86p5g\" (UniqueName: \"kubernetes.io/projected/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-kube-api-access-86p5g\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:44:36.971724 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:36.971675 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:44:36.971724 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:36.971685 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:44:37.161471 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.161443 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8_1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128/storage-initializer/1.log" Apr 24 17:44:37.161816 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.161801 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8_1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128/storage-initializer/0.log" Apr 24 
17:44:37.161861 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.161837 2573 generic.go:358] "Generic (PLEG): container finished" podID="1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128" containerID="19666669f3c723b8ad654ace9d5e2b8201e5c9237a58c519b4a85f80b849a9bc" exitCode=1 Apr 24 17:44:37.161896 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.161872 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8" event={"ID":"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128","Type":"ContainerDied","Data":"19666669f3c723b8ad654ace9d5e2b8201e5c9237a58c519b4a85f80b849a9bc"} Apr 24 17:44:37.161927 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.161911 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8" event={"ID":"1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128","Type":"ContainerDied","Data":"305dc3dfeeef7a4401db02b2d6d7d87c6c93108c4c1fc19b1bbff5b1a66bf4c1"} Apr 24 17:44:37.161927 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.161917 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8" Apr 24 17:44:37.161994 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.161927 2573 scope.go:117] "RemoveContainer" containerID="19666669f3c723b8ad654ace9d5e2b8201e5c9237a58c519b4a85f80b849a9bc" Apr 24 17:44:37.170573 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.170555 2573 scope.go:117] "RemoveContainer" containerID="7ff8f3cfe0d9cdcd72a0d82e1d166a3da3f92444a71e60fbe9c4ba4090dbee9b" Apr 24 17:44:37.177269 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.177251 2573 scope.go:117] "RemoveContainer" containerID="19666669f3c723b8ad654ace9d5e2b8201e5c9237a58c519b4a85f80b849a9bc" Apr 24 17:44:37.177556 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:44:37.177538 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19666669f3c723b8ad654ace9d5e2b8201e5c9237a58c519b4a85f80b849a9bc\": container with ID starting with 19666669f3c723b8ad654ace9d5e2b8201e5c9237a58c519b4a85f80b849a9bc not found: ID does not exist" containerID="19666669f3c723b8ad654ace9d5e2b8201e5c9237a58c519b4a85f80b849a9bc" Apr 24 17:44:37.177630 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.177563 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19666669f3c723b8ad654ace9d5e2b8201e5c9237a58c519b4a85f80b849a9bc"} err="failed to get container status \"19666669f3c723b8ad654ace9d5e2b8201e5c9237a58c519b4a85f80b849a9bc\": rpc error: code = NotFound desc = could not find container \"19666669f3c723b8ad654ace9d5e2b8201e5c9237a58c519b4a85f80b849a9bc\": container with ID starting with 19666669f3c723b8ad654ace9d5e2b8201e5c9237a58c519b4a85f80b849a9bc not found: ID does not exist" Apr 24 17:44:37.177630 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.177580 2573 scope.go:117] "RemoveContainer" containerID="7ff8f3cfe0d9cdcd72a0d82e1d166a3da3f92444a71e60fbe9c4ba4090dbee9b" 
Apr 24 17:44:37.177806 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:44:37.177786 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ff8f3cfe0d9cdcd72a0d82e1d166a3da3f92444a71e60fbe9c4ba4090dbee9b\": container with ID starting with 7ff8f3cfe0d9cdcd72a0d82e1d166a3da3f92444a71e60fbe9c4ba4090dbee9b not found: ID does not exist" containerID="7ff8f3cfe0d9cdcd72a0d82e1d166a3da3f92444a71e60fbe9c4ba4090dbee9b" Apr 24 17:44:37.177846 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.177814 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ff8f3cfe0d9cdcd72a0d82e1d166a3da3f92444a71e60fbe9c4ba4090dbee9b"} err="failed to get container status \"7ff8f3cfe0d9cdcd72a0d82e1d166a3da3f92444a71e60fbe9c4ba4090dbee9b\": rpc error: code = NotFound desc = could not find container \"7ff8f3cfe0d9cdcd72a0d82e1d166a3da3f92444a71e60fbe9c4ba4090dbee9b\": container with ID starting with 7ff8f3cfe0d9cdcd72a0d82e1d166a3da3f92444a71e60fbe9c4ba4090dbee9b not found: ID does not exist" Apr 24 17:44:37.198517 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.198484 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8"] Apr 24 17:44:37.201706 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.201681 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5475696f5b-tvwm8"] Apr 24 17:44:37.728692 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.728657 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk"] Apr 24 17:44:37.729063 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.728920 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" containerName="kserve-container" 
Apr 24 17:44:37.729063 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.728931 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" containerName="kserve-container" Apr 24 17:44:37.729063 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.728951 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" containerName="storage-initializer" Apr 24 17:44:37.729063 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.728959 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" containerName="storage-initializer" Apr 24 17:44:37.729063 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.728973 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128" containerName="storage-initializer" Apr 24 17:44:37.729063 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.728982 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128" containerName="storage-initializer" Apr 24 17:44:37.729063 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.728996 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128" containerName="storage-initializer" Apr 24 17:44:37.729063 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.729005 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128" containerName="storage-initializer" Apr 24 17:44:37.729063 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.729017 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" containerName="kube-rbac-proxy" Apr 24 17:44:37.729063 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.729024 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" 
containerName="kube-rbac-proxy" Apr 24 17:44:37.729460 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.729095 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" containerName="kube-rbac-proxy" Apr 24 17:44:37.729460 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.729109 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128" containerName="storage-initializer" Apr 24 17:44:37.729460 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.729120 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="eb63748f-5a6b-41f3-8e4b-c85ac2bc283c" containerName="kserve-container" Apr 24 17:44:37.729460 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.729252 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128" containerName="storage-initializer" Apr 24 17:44:37.733409 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.733391 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" Apr 24 17:44:37.735505 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.735479 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert\"" Apr 24 17:44:37.735695 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.735674 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-2jnrr\"" Apr 24 17:44:37.735784 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.735724 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 17:44:37.736529 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.736510 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 17:44:37.736633 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.736615 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\"" Apr 24 17:44:37.736697 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.736646 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 24 17:44:37.736751 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.736723 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 17:44:37.741526 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.741503 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk"] Apr 24 17:44:37.777117 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.777086 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk\" (UID: \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" Apr 24 17:44:37.777333 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.777123 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk\" (UID: \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" Apr 24 17:44:37.777333 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.777154 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk\" (UID: \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" Apr 24 17:44:37.777333 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.777190 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5hk8\" (UniqueName: \"kubernetes.io/projected/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-kube-api-access-b5hk8\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk\" (UID: \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" Apr 24 17:44:37.777333 
ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.777212 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk\" (UID: \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" Apr 24 17:44:37.877614 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.877574 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk\" (UID: \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" Apr 24 17:44:37.877792 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.877625 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk\" (UID: \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" Apr 24 17:44:37.877792 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.877655 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b5hk8\" (UniqueName: \"kubernetes.io/projected/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-kube-api-access-b5hk8\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk\" (UID: \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" Apr 24 17:44:37.877792 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.877687 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk\" (UID: \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" Apr 24 17:44:37.877792 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.877776 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk\" (UID: \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" Apr 24 17:44:37.878151 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.878127 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk\" (UID: \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" Apr 24 17:44:37.878287 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.878267 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk\" (UID: 
\"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" Apr 24 17:44:37.878383 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.878361 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk\" (UID: \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" Apr 24 17:44:37.880434 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.880413 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk\" (UID: \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" Apr 24 17:44:37.885743 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:37.885725 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5hk8\" (UniqueName: \"kubernetes.io/projected/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-kube-api-access-b5hk8\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk\" (UID: \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" Apr 24 17:44:38.044559 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:38.044517 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" Apr 24 17:44:38.164873 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:38.164841 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk"] Apr 24 17:44:38.168300 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:44:38.168275 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e197db2_9ed4_4a7b_9333_6b8d01b18a11.slice/crio-0a1a98f958bf80beda453cc4f68fd25b71356a765408985c7b7fd41c64ebb07c WatchSource:0}: Error finding container 0a1a98f958bf80beda453cc4f68fd25b71356a765408985c7b7fd41c64ebb07c: Status 404 returned error can't find the container with id 0a1a98f958bf80beda453cc4f68fd25b71356a765408985c7b7fd41c64ebb07c Apr 24 17:44:38.217159 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:38.217129 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128" path="/var/lib/kubelet/pods/1d397b5f-6ad3-4fc0-8f4b-7cfae1ef1128/volumes" Apr 24 17:44:39.171119 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:39.171087 2573 generic.go:358] "Generic (PLEG): container finished" podID="0e197db2-9ed4-4a7b-9333-6b8d01b18a11" containerID="2d40ac877066d86de0962cde3faa45d07afc1464bbdcc257de95d2fe6689999b" exitCode=0 Apr 24 17:44:39.171482 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:39.171143 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" event={"ID":"0e197db2-9ed4-4a7b-9333-6b8d01b18a11","Type":"ContainerDied","Data":"2d40ac877066d86de0962cde3faa45d07afc1464bbdcc257de95d2fe6689999b"} Apr 24 17:44:39.171482 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:39.171169 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" event={"ID":"0e197db2-9ed4-4a7b-9333-6b8d01b18a11","Type":"ContainerStarted","Data":"0a1a98f958bf80beda453cc4f68fd25b71356a765408985c7b7fd41c64ebb07c"} Apr 24 17:44:40.181240 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:40.181200 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" event={"ID":"0e197db2-9ed4-4a7b-9333-6b8d01b18a11","Type":"ContainerStarted","Data":"72f5d4d5abc29086c5f8cdae814489431aaac97fc9143d76471870602a61b643"} Apr 24 17:44:40.181240 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:40.181242 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" event={"ID":"0e197db2-9ed4-4a7b-9333-6b8d01b18a11","Type":"ContainerStarted","Data":"2a3748b3d6acaae4b4babb3a076bf343173db920f08f589dec621ac5776b4a5c"} Apr 24 17:44:40.181721 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:40.181459 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" Apr 24 17:44:40.181721 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:40.181490 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" Apr 24 17:44:40.182655 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:40.182631 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" podUID="0e197db2-9ed4-4a7b-9333-6b8d01b18a11" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused" Apr 24 17:44:40.199784 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:40.199586 2573 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" podStartSLOduration=3.199570274 podStartE2EDuration="3.199570274s" podCreationTimestamp="2026-04-24 17:44:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:44:40.199141133 +0000 UTC m=+3938.463856528" watchObservedRunningTime="2026-04-24 17:44:40.199570274 +0000 UTC m=+3938.464285683" Apr 24 17:44:41.184681 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:41.184633 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" podUID="0e197db2-9ed4-4a7b-9333-6b8d01b18a11" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused" Apr 24 17:44:46.189170 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:46.189142 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" Apr 24 17:44:46.189680 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:46.189655 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" podUID="0e197db2-9ed4-4a7b-9333-6b8d01b18a11" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused" Apr 24 17:44:56.189682 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:44:56.189593 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" podUID="0e197db2-9ed4-4a7b-9333-6b8d01b18a11" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused" Apr 24 17:45:06.190566 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:06.190525 2573 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" podUID="0e197db2-9ed4-4a7b-9333-6b8d01b18a11" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused" Apr 24 17:45:16.189889 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:16.189850 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" podUID="0e197db2-9ed4-4a7b-9333-6b8d01b18a11" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused" Apr 24 17:45:26.189981 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:26.189939 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" podUID="0e197db2-9ed4-4a7b-9333-6b8d01b18a11" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused" Apr 24 17:45:36.189901 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:36.189860 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" podUID="0e197db2-9ed4-4a7b-9333-6b8d01b18a11" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused" Apr 24 17:45:46.190494 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:46.190463 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" Apr 24 17:45:47.751574 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:47.751544 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk"] Apr 24 17:45:47.751989 ip-10-0-142-182 kubenswrapper[2573]: I0424 
17:45:47.751824 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" podUID="0e197db2-9ed4-4a7b-9333-6b8d01b18a11" containerName="kserve-container" containerID="cri-o://2a3748b3d6acaae4b4babb3a076bf343173db920f08f589dec621ac5776b4a5c" gracePeriod=30 Apr 24 17:45:47.751989 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:47.751875 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" podUID="0e197db2-9ed4-4a7b-9333-6b8d01b18a11" containerName="kube-rbac-proxy" containerID="cri-o://72f5d4d5abc29086c5f8cdae814489431aaac97fc9143d76471870602a61b643" gracePeriod=30 Apr 24 17:45:48.365714 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:48.365679 2573 generic.go:358] "Generic (PLEG): container finished" podID="0e197db2-9ed4-4a7b-9333-6b8d01b18a11" containerID="72f5d4d5abc29086c5f8cdae814489431aaac97fc9143d76471870602a61b643" exitCode=2 Apr 24 17:45:48.365891 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:48.365757 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" event={"ID":"0e197db2-9ed4-4a7b-9333-6b8d01b18a11","Type":"ContainerDied","Data":"72f5d4d5abc29086c5f8cdae814489431aaac97fc9143d76471870602a61b643"} Apr 24 17:45:48.831003 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:48.830964 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk"] Apr 24 17:45:48.835026 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:48.835001 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" Apr 24 17:45:48.837086 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:48.837055 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\"" Apr 24 17:45:48.837212 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:48.837123 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert\"" Apr 24 17:45:48.843057 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:48.843028 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk"] Apr 24 17:45:48.845339 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:48.845286 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9915a312-367a-405f-bf4d-49c11f4a4684-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk\" (UID: \"9915a312-367a-405f-bf4d-49c11f4a4684\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" Apr 24 17:45:48.845461 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:48.845364 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxwg4\" (UniqueName: \"kubernetes.io/projected/9915a312-367a-405f-bf4d-49c11f4a4684-kube-api-access-qxwg4\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk\" (UID: \"9915a312-367a-405f-bf4d-49c11f4a4684\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" Apr 24 17:45:48.845461 ip-10-0-142-182 kubenswrapper[2573]: I0424 
17:45:48.845397 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9915a312-367a-405f-bf4d-49c11f4a4684-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk\" (UID: \"9915a312-367a-405f-bf4d-49c11f4a4684\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" Apr 24 17:45:48.845461 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:48.845422 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9915a312-367a-405f-bf4d-49c11f4a4684-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk\" (UID: \"9915a312-367a-405f-bf4d-49c11f4a4684\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" Apr 24 17:45:48.945962 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:48.945929 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9915a312-367a-405f-bf4d-49c11f4a4684-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk\" (UID: \"9915a312-367a-405f-bf4d-49c11f4a4684\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" Apr 24 17:45:48.945962 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:48.945967 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxwg4\" (UniqueName: \"kubernetes.io/projected/9915a312-367a-405f-bf4d-49c11f4a4684-kube-api-access-qxwg4\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk\" (UID: \"9915a312-367a-405f-bf4d-49c11f4a4684\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" Apr 24 17:45:48.946270 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:48.945990 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9915a312-367a-405f-bf4d-49c11f4a4684-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk\" (UID: \"9915a312-367a-405f-bf4d-49c11f4a4684\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" Apr 24 17:45:48.946270 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:48.946007 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9915a312-367a-405f-bf4d-49c11f4a4684-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk\" (UID: \"9915a312-367a-405f-bf4d-49c11f4a4684\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" Apr 24 17:45:48.946270 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:45:48.946113 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found Apr 24 17:45:48.946270 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:45:48.946208 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9915a312-367a-405f-bf4d-49c11f4a4684-proxy-tls podName:9915a312-367a-405f-bf4d-49c11f4a4684 nodeName:}" failed. No retries permitted until 2026-04-24 17:45:49.446186464 +0000 UTC m=+4007.710901841 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9915a312-367a-405f-bf4d-49c11f4a4684-proxy-tls") pod "isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" (UID: "9915a312-367a-405f-bf4d-49c11f4a4684") : secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found Apr 24 17:45:48.946536 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:48.946435 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9915a312-367a-405f-bf4d-49c11f4a4684-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk\" (UID: \"9915a312-367a-405f-bf4d-49c11f4a4684\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" Apr 24 17:45:48.946652 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:48.946633 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9915a312-367a-405f-bf4d-49c11f4a4684-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk\" (UID: \"9915a312-367a-405f-bf4d-49c11f4a4684\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" Apr 24 17:45:48.956602 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:48.956570 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxwg4\" (UniqueName: \"kubernetes.io/projected/9915a312-367a-405f-bf4d-49c11f4a4684-kube-api-access-qxwg4\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk\" (UID: \"9915a312-367a-405f-bf4d-49c11f4a4684\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" Apr 24 17:45:49.449277 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:49.449215 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9915a312-367a-405f-bf4d-49c11f4a4684-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk\" (UID: \"9915a312-367a-405f-bf4d-49c11f4a4684\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" Apr 24 17:45:49.451811 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:49.451775 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9915a312-367a-405f-bf4d-49c11f4a4684-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk\" (UID: \"9915a312-367a-405f-bf4d-49c11f4a4684\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" Apr 24 17:45:49.745687 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:49.745647 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" Apr 24 17:45:49.875684 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:49.875652 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk"] Apr 24 17:45:49.878668 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:45:49.878640 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9915a312_367a_405f_bf4d_49c11f4a4684.slice/crio-cd376be2ca2b21954e5301f197aaf54f33eef4851cc24f10ba52ac5e9e9ac8ef WatchSource:0}: Error finding container cd376be2ca2b21954e5301f197aaf54f33eef4851cc24f10ba52ac5e9e9ac8ef: Status 404 returned error can't find the container with id cd376be2ca2b21954e5301f197aaf54f33eef4851cc24f10ba52ac5e9e9ac8ef Apr 24 17:45:50.372845 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:50.372802 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" event={"ID":"9915a312-367a-405f-bf4d-49c11f4a4684","Type":"ContainerStarted","Data":"f96516d9c8c703a7358d75072e978786fcc8f3b86f86740a48bf9eef74868148"} Apr 24 17:45:50.372845 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:50.372851 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" event={"ID":"9915a312-367a-405f-bf4d-49c11f4a4684","Type":"ContainerStarted","Data":"cd376be2ca2b21954e5301f197aaf54f33eef4851cc24f10ba52ac5e9e9ac8ef"} Apr 24 17:45:51.185218 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:51.185167 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" podUID="0e197db2-9ed4-4a7b-9333-6b8d01b18a11" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.66:8643/healthz\": dial tcp 10.134.0.66:8643: connect: connection refused" Apr 24 17:45:52.381218 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:52.381184 2573 generic.go:358] "Generic (PLEG): container finished" podID="0e197db2-9ed4-4a7b-9333-6b8d01b18a11" containerID="2a3748b3d6acaae4b4babb3a076bf343173db920f08f589dec621ac5776b4a5c" exitCode=0 Apr 24 17:45:52.381617 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:52.381259 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" event={"ID":"0e197db2-9ed4-4a7b-9333-6b8d01b18a11","Type":"ContainerDied","Data":"2a3748b3d6acaae4b4babb3a076bf343173db920f08f589dec621ac5776b4a5c"} Apr 24 17:45:52.381617 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:52.381291 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" 
event={"ID":"0e197db2-9ed4-4a7b-9333-6b8d01b18a11","Type":"ContainerDied","Data":"0a1a98f958bf80beda453cc4f68fd25b71356a765408985c7b7fd41c64ebb07c"} Apr 24 17:45:52.381617 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:52.381302 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a1a98f958bf80beda453cc4f68fd25b71356a765408985c7b7fd41c64ebb07c" Apr 24 17:45:52.387729 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:52.387709 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" Apr 24 17:45:52.471185 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:52.471152 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-kserve-provision-location\") pod \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\" (UID: \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\") " Apr 24 17:45:52.471185 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:52.471195 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\" (UID: \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\") " Apr 24 17:45:52.471459 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:52.471230 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-proxy-tls\") pod \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\" (UID: \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\") " Apr 24 17:45:52.471459 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:52.471257 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-cabundle-cert\") pod \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\" (UID: \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\") " Apr 24 17:45:52.471459 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:52.471292 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5hk8\" (UniqueName: \"kubernetes.io/projected/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-kube-api-access-b5hk8\") pod \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\" (UID: \"0e197db2-9ed4-4a7b-9333-6b8d01b18a11\") " Apr 24 17:45:52.471626 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:52.471575 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0e197db2-9ed4-4a7b-9333-6b8d01b18a11" (UID: "0e197db2-9ed4-4a7b-9333-6b8d01b18a11"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:45:52.471626 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:52.471612 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config") pod "0e197db2-9ed4-4a7b-9333-6b8d01b18a11" (UID: "0e197db2-9ed4-4a7b-9333-6b8d01b18a11"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:45:52.471741 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:52.471698 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "0e197db2-9ed4-4a7b-9333-6b8d01b18a11" (UID: "0e197db2-9ed4-4a7b-9333-6b8d01b18a11"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:45:52.473583 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:52.473562 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0e197db2-9ed4-4a7b-9333-6b8d01b18a11" (UID: "0e197db2-9ed4-4a7b-9333-6b8d01b18a11"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:45:52.473650 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:52.473576 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-kube-api-access-b5hk8" (OuterVolumeSpecName: "kube-api-access-b5hk8") pod "0e197db2-9ed4-4a7b-9333-6b8d01b18a11" (UID: "0e197db2-9ed4-4a7b-9333-6b8d01b18a11"). InnerVolumeSpecName "kube-api-access-b5hk8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:45:52.572076 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:52.572035 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:45:52.572076 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:52.572070 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:45:52.572076 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:52.572079 2573 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-cabundle-cert\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:45:52.572340 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:52.572088 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b5hk8\" (UniqueName: \"kubernetes.io/projected/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-kube-api-access-b5hk8\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:45:52.572340 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:52.572098 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e197db2-9ed4-4a7b-9333-6b8d01b18a11-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:45:53.385612 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:53.385581 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk_9915a312-367a-405f-bf4d-49c11f4a4684/storage-initializer/0.log" Apr 24 17:45:53.385984 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:53.385622 2573 generic.go:358] "Generic (PLEG): container finished" podID="9915a312-367a-405f-bf4d-49c11f4a4684" containerID="f96516d9c8c703a7358d75072e978786fcc8f3b86f86740a48bf9eef74868148" exitCode=1 Apr 24 17:45:53.385984 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:53.385698 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" event={"ID":"9915a312-367a-405f-bf4d-49c11f4a4684","Type":"ContainerDied","Data":"f96516d9c8c703a7358d75072e978786fcc8f3b86f86740a48bf9eef74868148"} Apr 24 17:45:53.385984 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:53.385765 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk" Apr 24 17:45:53.426114 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:53.426081 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk"] Apr 24 17:45:53.428021 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:53.427997 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598dbb6f45-2w6rk"] Apr 24 17:45:54.216041 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:54.216008 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e197db2-9ed4-4a7b-9333-6b8d01b18a11" path="/var/lib/kubelet/pods/0e197db2-9ed4-4a7b-9333-6b8d01b18a11/volumes" Apr 24 17:45:54.389880 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:54.389851 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk_9915a312-367a-405f-bf4d-49c11f4a4684/storage-initializer/0.log" Apr 24 17:45:54.390281 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:54.389912 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" event={"ID":"9915a312-367a-405f-bf4d-49c11f4a4684","Type":"ContainerStarted","Data":"0aa6eceeb808db0baa900f1ef557ce44568d5fb213a6dcef0425a2371cafd620"} Apr 24 17:45:58.589989 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:45:58.589954 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9915a312_367a_405f_bf4d_49c11f4a4684.slice/crio-0aa6eceeb808db0baa900f1ef557ce44568d5fb213a6dcef0425a2371cafd620.scope\": RecentStats: unable to find data in memory cache]" Apr 24 17:45:58.799984 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:58.799943 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk"] Apr 24 17:45:58.800259 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:58.800234 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" podUID="9915a312-367a-405f-bf4d-49c11f4a4684" containerName="storage-initializer" containerID="cri-o://0aa6eceeb808db0baa900f1ef557ce44568d5fb213a6dcef0425a2371cafd620" gracePeriod=30 Apr 24 17:45:58.930196 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:58.930170 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk_9915a312-367a-405f-bf4d-49c11f4a4684/storage-initializer/1.log" Apr 24 17:45:58.930576 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:58.930560 2573 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk_9915a312-367a-405f-bf4d-49c11f4a4684/storage-initializer/0.log" Apr 24 17:45:58.930637 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:58.930627 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" Apr 24 17:45:59.016368 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.016301 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9915a312-367a-405f-bf4d-49c11f4a4684-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"9915a312-367a-405f-bf4d-49c11f4a4684\" (UID: \"9915a312-367a-405f-bf4d-49c11f4a4684\") " Apr 24 17:45:59.016525 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.016400 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9915a312-367a-405f-bf4d-49c11f4a4684-kserve-provision-location\") pod \"9915a312-367a-405f-bf4d-49c11f4a4684\" (UID: \"9915a312-367a-405f-bf4d-49c11f4a4684\") " Apr 24 17:45:59.016525 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.016431 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxwg4\" (UniqueName: \"kubernetes.io/projected/9915a312-367a-405f-bf4d-49c11f4a4684-kube-api-access-qxwg4\") pod \"9915a312-367a-405f-bf4d-49c11f4a4684\" (UID: \"9915a312-367a-405f-bf4d-49c11f4a4684\") " Apr 24 17:45:59.016525 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.016482 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9915a312-367a-405f-bf4d-49c11f4a4684-proxy-tls\") pod 
\"9915a312-367a-405f-bf4d-49c11f4a4684\" (UID: \"9915a312-367a-405f-bf4d-49c11f4a4684\") " Apr 24 17:45:59.016719 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.016691 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9915a312-367a-405f-bf4d-49c11f4a4684-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9915a312-367a-405f-bf4d-49c11f4a4684" (UID: "9915a312-367a-405f-bf4d-49c11f4a4684"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:45:59.016780 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.016744 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9915a312-367a-405f-bf4d-49c11f4a4684-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config") pod "9915a312-367a-405f-bf4d-49c11f4a4684" (UID: "9915a312-367a-405f-bf4d-49c11f4a4684"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:45:59.018618 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.018591 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9915a312-367a-405f-bf4d-49c11f4a4684-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9915a312-367a-405f-bf4d-49c11f4a4684" (UID: "9915a312-367a-405f-bf4d-49c11f4a4684"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:45:59.018715 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.018671 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9915a312-367a-405f-bf4d-49c11f4a4684-kube-api-access-qxwg4" (OuterVolumeSpecName: "kube-api-access-qxwg4") pod "9915a312-367a-405f-bf4d-49c11f4a4684" (UID: "9915a312-367a-405f-bf4d-49c11f4a4684"). InnerVolumeSpecName "kube-api-access-qxwg4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:45:59.117588 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.117482 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9915a312-367a-405f-bf4d-49c11f4a4684-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:45:59.117588 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.117532 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9915a312-367a-405f-bf4d-49c11f4a4684-kserve-provision-location\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:45:59.117588 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.117544 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qxwg4\" (UniqueName: \"kubernetes.io/projected/9915a312-367a-405f-bf4d-49c11f4a4684-kube-api-access-qxwg4\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:45:59.117588 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.117553 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9915a312-367a-405f-bf4d-49c11f4a4684-proxy-tls\") on node \"ip-10-0-142-182.ec2.internal\" DevicePath \"\"" Apr 24 17:45:59.403177 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.403096 2573 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk_9915a312-367a-405f-bf4d-49c11f4a4684/storage-initializer/1.log" Apr 24 17:45:59.403516 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.403499 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk_9915a312-367a-405f-bf4d-49c11f4a4684/storage-initializer/0.log" Apr 24 17:45:59.403581 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.403537 2573 generic.go:358] "Generic (PLEG): container finished" podID="9915a312-367a-405f-bf4d-49c11f4a4684" containerID="0aa6eceeb808db0baa900f1ef557ce44568d5fb213a6dcef0425a2371cafd620" exitCode=1 Apr 24 17:45:59.403618 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.403571 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" event={"ID":"9915a312-367a-405f-bf4d-49c11f4a4684","Type":"ContainerDied","Data":"0aa6eceeb808db0baa900f1ef557ce44568d5fb213a6dcef0425a2371cafd620"} Apr 24 17:45:59.403618 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.403608 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" event={"ID":"9915a312-367a-405f-bf4d-49c11f4a4684","Type":"ContainerDied","Data":"cd376be2ca2b21954e5301f197aaf54f33eef4851cc24f10ba52ac5e9e9ac8ef"} Apr 24 17:45:59.403685 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.403625 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk" Apr 24 17:45:59.403733 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.403628 2573 scope.go:117] "RemoveContainer" containerID="0aa6eceeb808db0baa900f1ef557ce44568d5fb213a6dcef0425a2371cafd620" Apr 24 17:45:59.412170 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.412150 2573 scope.go:117] "RemoveContainer" containerID="f96516d9c8c703a7358d75072e978786fcc8f3b86f86740a48bf9eef74868148" Apr 24 17:45:59.419348 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.419329 2573 scope.go:117] "RemoveContainer" containerID="0aa6eceeb808db0baa900f1ef557ce44568d5fb213a6dcef0425a2371cafd620" Apr 24 17:45:59.419581 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:45:59.419565 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aa6eceeb808db0baa900f1ef557ce44568d5fb213a6dcef0425a2371cafd620\": container with ID starting with 0aa6eceeb808db0baa900f1ef557ce44568d5fb213a6dcef0425a2371cafd620 not found: ID does not exist" containerID="0aa6eceeb808db0baa900f1ef557ce44568d5fb213a6dcef0425a2371cafd620" Apr 24 17:45:59.419633 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.419589 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aa6eceeb808db0baa900f1ef557ce44568d5fb213a6dcef0425a2371cafd620"} err="failed to get container status \"0aa6eceeb808db0baa900f1ef557ce44568d5fb213a6dcef0425a2371cafd620\": rpc error: code = NotFound desc = could not find container \"0aa6eceeb808db0baa900f1ef557ce44568d5fb213a6dcef0425a2371cafd620\": container with ID starting with 0aa6eceeb808db0baa900f1ef557ce44568d5fb213a6dcef0425a2371cafd620 not found: ID does not exist" Apr 24 17:45:59.419633 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.419606 2573 scope.go:117] "RemoveContainer" containerID="f96516d9c8c703a7358d75072e978786fcc8f3b86f86740a48bf9eef74868148" 
Apr 24 17:45:59.419831 ip-10-0-142-182 kubenswrapper[2573]: E0424 17:45:59.419814 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f96516d9c8c703a7358d75072e978786fcc8f3b86f86740a48bf9eef74868148\": container with ID starting with f96516d9c8c703a7358d75072e978786fcc8f3b86f86740a48bf9eef74868148 not found: ID does not exist" containerID="f96516d9c8c703a7358d75072e978786fcc8f3b86f86740a48bf9eef74868148" Apr 24 17:45:59.419880 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.419835 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f96516d9c8c703a7358d75072e978786fcc8f3b86f86740a48bf9eef74868148"} err="failed to get container status \"f96516d9c8c703a7358d75072e978786fcc8f3b86f86740a48bf9eef74868148\": rpc error: code = NotFound desc = could not find container \"f96516d9c8c703a7358d75072e978786fcc8f3b86f86740a48bf9eef74868148\": container with ID starting with f96516d9c8c703a7358d75072e978786fcc8f3b86f86740a48bf9eef74868148 not found: ID does not exist" Apr 24 17:45:59.437869 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.437838 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk"] Apr 24 17:45:59.442068 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:45:59.442038 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-695dddb755-bdnrk"] Apr 24 17:46:00.216792 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:00.216755 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9915a312-367a-405f-bf4d-49c11f4a4684" path="/var/lib/kubelet/pods/9915a312-367a-405f-bf4d-49c11f4a4684/volumes" Apr 24 17:46:29.376916 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.376877 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-twkp6/must-gather-qb959"] 
Apr 24 17:46:29.377456 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.377256 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9915a312-367a-405f-bf4d-49c11f4a4684" containerName="storage-initializer" Apr 24 17:46:29.377456 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.377273 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9915a312-367a-405f-bf4d-49c11f4a4684" containerName="storage-initializer" Apr 24 17:46:29.377456 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.377300 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e197db2-9ed4-4a7b-9333-6b8d01b18a11" containerName="storage-initializer" Apr 24 17:46:29.377456 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.377328 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e197db2-9ed4-4a7b-9333-6b8d01b18a11" containerName="storage-initializer" Apr 24 17:46:29.377456 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.377341 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e197db2-9ed4-4a7b-9333-6b8d01b18a11" containerName="kserve-container" Apr 24 17:46:29.377456 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.377352 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e197db2-9ed4-4a7b-9333-6b8d01b18a11" containerName="kserve-container" Apr 24 17:46:29.377456 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.377363 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e197db2-9ed4-4a7b-9333-6b8d01b18a11" containerName="kube-rbac-proxy" Apr 24 17:46:29.377456 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.377370 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e197db2-9ed4-4a7b-9333-6b8d01b18a11" containerName="kube-rbac-proxy" Apr 24 17:46:29.377456 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.377439 2573 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="0e197db2-9ed4-4a7b-9333-6b8d01b18a11" containerName="kube-rbac-proxy" Apr 24 17:46:29.377456 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.377452 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e197db2-9ed4-4a7b-9333-6b8d01b18a11" containerName="kserve-container" Apr 24 17:46:29.377954 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.377464 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9915a312-367a-405f-bf4d-49c11f4a4684" containerName="storage-initializer" Apr 24 17:46:29.377954 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.377474 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9915a312-367a-405f-bf4d-49c11f4a4684" containerName="storage-initializer" Apr 24 17:46:29.377954 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.377544 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9915a312-367a-405f-bf4d-49c11f4a4684" containerName="storage-initializer" Apr 24 17:46:29.377954 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.377554 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9915a312-367a-405f-bf4d-49c11f4a4684" containerName="storage-initializer" Apr 24 17:46:29.380586 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.380566 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-twkp6/must-gather-qb959" Apr 24 17:46:29.382908 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.382878 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-twkp6\"/\"openshift-service-ca.crt\"" Apr 24 17:46:29.382908 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.382906 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-twkp6\"/\"default-dockercfg-7txp2\"" Apr 24 17:46:29.383104 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.382969 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-twkp6\"/\"kube-root-ca.crt\"" Apr 24 17:46:29.386983 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.386952 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-twkp6/must-gather-qb959"] Apr 24 17:46:29.450454 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.450416 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/93e1c005-5c22-4cf5-9fc9-f38e6cd9b92a-must-gather-output\") pod \"must-gather-qb959\" (UID: \"93e1c005-5c22-4cf5-9fc9-f38e6cd9b92a\") " pod="openshift-must-gather-twkp6/must-gather-qb959" Apr 24 17:46:29.450454 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.450456 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h926\" (UniqueName: \"kubernetes.io/projected/93e1c005-5c22-4cf5-9fc9-f38e6cd9b92a-kube-api-access-9h926\") pod \"must-gather-qb959\" (UID: \"93e1c005-5c22-4cf5-9fc9-f38e6cd9b92a\") " pod="openshift-must-gather-twkp6/must-gather-qb959" Apr 24 17:46:29.551931 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.551895 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/93e1c005-5c22-4cf5-9fc9-f38e6cd9b92a-must-gather-output\") pod \"must-gather-qb959\" (UID: \"93e1c005-5c22-4cf5-9fc9-f38e6cd9b92a\") " pod="openshift-must-gather-twkp6/must-gather-qb959" Apr 24 17:46:29.551931 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.551931 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9h926\" (UniqueName: \"kubernetes.io/projected/93e1c005-5c22-4cf5-9fc9-f38e6cd9b92a-kube-api-access-9h926\") pod \"must-gather-qb959\" (UID: \"93e1c005-5c22-4cf5-9fc9-f38e6cd9b92a\") " pod="openshift-must-gather-twkp6/must-gather-qb959" Apr 24 17:46:29.552235 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.552216 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/93e1c005-5c22-4cf5-9fc9-f38e6cd9b92a-must-gather-output\") pod \"must-gather-qb959\" (UID: \"93e1c005-5c22-4cf5-9fc9-f38e6cd9b92a\") " pod="openshift-must-gather-twkp6/must-gather-qb959" Apr 24 17:46:29.560393 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.560356 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h926\" (UniqueName: \"kubernetes.io/projected/93e1c005-5c22-4cf5-9fc9-f38e6cd9b92a-kube-api-access-9h926\") pod \"must-gather-qb959\" (UID: \"93e1c005-5c22-4cf5-9fc9-f38e6cd9b92a\") " pod="openshift-must-gather-twkp6/must-gather-qb959" Apr 24 17:46:29.690261 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.690166 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-twkp6/must-gather-qb959" Apr 24 17:46:29.809269 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:29.809237 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-twkp6/must-gather-qb959"] Apr 24 17:46:29.812278 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:46:29.812239 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93e1c005_5c22_4cf5_9fc9_f38e6cd9b92a.slice/crio-678cdc9ef10951990ecc9c90c2f8537ef55a5dd08526f6dd924be46c9445db79 WatchSource:0}: Error finding container 678cdc9ef10951990ecc9c90c2f8537ef55a5dd08526f6dd924be46c9445db79: Status 404 returned error can't find the container with id 678cdc9ef10951990ecc9c90c2f8537ef55a5dd08526f6dd924be46c9445db79 Apr 24 17:46:30.497348 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:30.497296 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-twkp6/must-gather-qb959" event={"ID":"93e1c005-5c22-4cf5-9fc9-f38e6cd9b92a","Type":"ContainerStarted","Data":"678cdc9ef10951990ecc9c90c2f8537ef55a5dd08526f6dd924be46c9445db79"} Apr 24 17:46:31.502061 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:31.502023 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-twkp6/must-gather-qb959" event={"ID":"93e1c005-5c22-4cf5-9fc9-f38e6cd9b92a","Type":"ContainerStarted","Data":"1717623abddcd230c29139d63e48ccb9ccd748eed6d4f61ad4817ed817587f65"} Apr 24 17:46:31.502517 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:31.502069 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-twkp6/must-gather-qb959" event={"ID":"93e1c005-5c22-4cf5-9fc9-f38e6cd9b92a","Type":"ContainerStarted","Data":"11d4e32418c287b9e56a8245305dee8efb27763133f6ab6ae96f6d0a124400b6"} Apr 24 17:46:31.518077 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:31.518007 2573 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-twkp6/must-gather-qb959" podStartSLOduration=1.563778342 podStartE2EDuration="2.51798785s" podCreationTimestamp="2026-04-24 17:46:29 +0000 UTC" firstStartedPulling="2026-04-24 17:46:29.814006416 +0000 UTC m=+4048.078721792" lastFinishedPulling="2026-04-24 17:46:30.768215924 +0000 UTC m=+4049.032931300" observedRunningTime="2026-04-24 17:46:31.515698377 +0000 UTC m=+4049.780413777" watchObservedRunningTime="2026-04-24 17:46:31.51798785 +0000 UTC m=+4049.782703249" Apr 24 17:46:32.408732 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:32.408696 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-cmffg_030a15c9-43fa-4bf6-a710-a4f8b1f3c7de/global-pull-secret-syncer/0.log" Apr 24 17:46:32.612691 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:32.612657 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-m2nnx_086451b6-0c60-4e42-8500-d8f31f29bb1e/konnectivity-agent/0.log" Apr 24 17:46:32.716083 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:32.716002 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-182.ec2.internal_63bf82c7cb78a8f9ba0c1c606e45e954/haproxy/0.log" Apr 24 17:46:36.452508 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:36.452432 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5fd685ffdd-zrbm5_334c37df-f60e-4d19-85bd-0aedb04d278b/metrics-server/0.log" Apr 24 17:46:36.534202 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:36.534135 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-65ts2_cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a/node-exporter/0.log" Apr 24 17:46:36.563974 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:36.563943 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-65ts2_cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a/kube-rbac-proxy/0.log" Apr 24 17:46:36.599101 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:36.599070 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-65ts2_cb5b0f1f-7a2b-4ce1-ad76-bcc4f3aa2a4a/init-textfile/0.log" Apr 24 17:46:36.834058 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:36.834021 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-pzh2f_460b36bf-3b60-407d-a1f4-a6660b7cb22f/kube-rbac-proxy-main/0.log" Apr 24 17:46:36.889404 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:36.889375 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-pzh2f_460b36bf-3b60-407d-a1f4-a6660b7cb22f/kube-rbac-proxy-self/0.log" Apr 24 17:46:36.925173 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:36.925141 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-pzh2f_460b36bf-3b60-407d-a1f4-a6660b7cb22f/openshift-state-metrics/0.log" Apr 24 17:46:39.259332 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:39.259277 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66c67549db-42lpr_0ee21b4c-e55e-4722-921c-f8e15ca79c5b/console/0.log" Apr 24 17:46:39.453881 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:39.453841 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg"] Apr 24 17:46:39.456590 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:39.456568 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg" Apr 24 17:46:39.462966 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:39.462934 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg"] Apr 24 17:46:39.542457 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:39.542359 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21278fd5-5095-4d05-a299-e3d8590bdafc-sys\") pod \"perf-node-gather-daemonset-9cngg\" (UID: \"21278fd5-5095-4d05-a299-e3d8590bdafc\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg" Apr 24 17:46:39.542457 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:39.542406 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/21278fd5-5095-4d05-a299-e3d8590bdafc-lib-modules\") pod \"perf-node-gather-daemonset-9cngg\" (UID: \"21278fd5-5095-4d05-a299-e3d8590bdafc\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg" Apr 24 17:46:39.542693 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:39.542473 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqkvf\" (UniqueName: \"kubernetes.io/projected/21278fd5-5095-4d05-a299-e3d8590bdafc-kube-api-access-cqkvf\") pod \"perf-node-gather-daemonset-9cngg\" (UID: \"21278fd5-5095-4d05-a299-e3d8590bdafc\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg" Apr 24 17:46:39.542693 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:39.542616 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/21278fd5-5095-4d05-a299-e3d8590bdafc-podres\") pod \"perf-node-gather-daemonset-9cngg\" (UID: 
\"21278fd5-5095-4d05-a299-e3d8590bdafc\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg" Apr 24 17:46:39.542693 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:39.542656 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/21278fd5-5095-4d05-a299-e3d8590bdafc-proc\") pod \"perf-node-gather-daemonset-9cngg\" (UID: \"21278fd5-5095-4d05-a299-e3d8590bdafc\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg" Apr 24 17:46:39.643649 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:39.643613 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21278fd5-5095-4d05-a299-e3d8590bdafc-sys\") pod \"perf-node-gather-daemonset-9cngg\" (UID: \"21278fd5-5095-4d05-a299-e3d8590bdafc\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg" Apr 24 17:46:39.643812 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:39.643661 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/21278fd5-5095-4d05-a299-e3d8590bdafc-lib-modules\") pod \"perf-node-gather-daemonset-9cngg\" (UID: \"21278fd5-5095-4d05-a299-e3d8590bdafc\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg" Apr 24 17:46:39.643812 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:39.643760 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21278fd5-5095-4d05-a299-e3d8590bdafc-sys\") pod \"perf-node-gather-daemonset-9cngg\" (UID: \"21278fd5-5095-4d05-a299-e3d8590bdafc\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg" Apr 24 17:46:39.643812 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:39.643785 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/21278fd5-5095-4d05-a299-e3d8590bdafc-lib-modules\") pod \"perf-node-gather-daemonset-9cngg\" (UID: \"21278fd5-5095-4d05-a299-e3d8590bdafc\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg" Apr 24 17:46:39.643919 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:39.643821 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqkvf\" (UniqueName: \"kubernetes.io/projected/21278fd5-5095-4d05-a299-e3d8590bdafc-kube-api-access-cqkvf\") pod \"perf-node-gather-daemonset-9cngg\" (UID: \"21278fd5-5095-4d05-a299-e3d8590bdafc\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg" Apr 24 17:46:39.643953 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:39.643929 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/21278fd5-5095-4d05-a299-e3d8590bdafc-podres\") pod \"perf-node-gather-daemonset-9cngg\" (UID: \"21278fd5-5095-4d05-a299-e3d8590bdafc\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg" Apr 24 17:46:39.643986 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:39.643953 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/21278fd5-5095-4d05-a299-e3d8590bdafc-proc\") pod \"perf-node-gather-daemonset-9cngg\" (UID: \"21278fd5-5095-4d05-a299-e3d8590bdafc\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg" Apr 24 17:46:39.644045 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:39.644030 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/21278fd5-5095-4d05-a299-e3d8590bdafc-proc\") pod \"perf-node-gather-daemonset-9cngg\" (UID: \"21278fd5-5095-4d05-a299-e3d8590bdafc\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg" Apr 24 17:46:39.644081 ip-10-0-142-182 kubenswrapper[2573]: I0424 
17:46:39.644043 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/21278fd5-5095-4d05-a299-e3d8590bdafc-podres\") pod \"perf-node-gather-daemonset-9cngg\" (UID: \"21278fd5-5095-4d05-a299-e3d8590bdafc\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg" Apr 24 17:46:39.652381 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:39.652353 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqkvf\" (UniqueName: \"kubernetes.io/projected/21278fd5-5095-4d05-a299-e3d8590bdafc-kube-api-access-cqkvf\") pod \"perf-node-gather-daemonset-9cngg\" (UID: \"21278fd5-5095-4d05-a299-e3d8590bdafc\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg" Apr 24 17:46:39.769635 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:39.769599 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg" Apr 24 17:46:39.906257 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:39.906195 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg"] Apr 24 17:46:39.909632 ip-10-0-142-182 kubenswrapper[2573]: W0424 17:46:39.909587 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod21278fd5_5095_4d05_a299_e3d8590bdafc.slice/crio-0a1d634c00362bb0bc5000ca8f32f28e1d17ba97d9b522238d2fd86698c1dffa WatchSource:0}: Error finding container 0a1d634c00362bb0bc5000ca8f32f28e1d17ba97d9b522238d2fd86698c1dffa: Status 404 returned error can't find the container with id 0a1d634c00362bb0bc5000ca8f32f28e1d17ba97d9b522238d2fd86698c1dffa Apr 24 17:46:40.419630 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:40.419596 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-f6b8n_71bfbde9-7779-4c60-8a8a-0b238f76e255/dns/0.log" Apr 24 17:46:40.446278 
ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:40.446246 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-f6b8n_71bfbde9-7779-4c60-8a8a-0b238f76e255/kube-rbac-proxy/0.log" Apr 24 17:46:40.519606 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:40.519576 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-djdhr_2267e8c5-f77e-4e17-a96f-9463ea75c147/dns-node-resolver/0.log" Apr 24 17:46:40.533559 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:40.533526 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg" event={"ID":"21278fd5-5095-4d05-a299-e3d8590bdafc","Type":"ContainerStarted","Data":"8e27d27b2aef3313df2b7fee0dcdd181e459727e23596f4c022bb70fdbf647ba"} Apr 24 17:46:40.533736 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:40.533562 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg" event={"ID":"21278fd5-5095-4d05-a299-e3d8590bdafc","Type":"ContainerStarted","Data":"0a1d634c00362bb0bc5000ca8f32f28e1d17ba97d9b522238d2fd86698c1dffa"} Apr 24 17:46:40.533736 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:40.533628 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg" Apr 24 17:46:40.549064 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:40.549000 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg" podStartSLOduration=1.5489804230000002 podStartE2EDuration="1.548980423s" podCreationTimestamp="2026-04-24 17:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:46:40.547674451 +0000 UTC m=+4058.812389849" watchObservedRunningTime="2026-04-24 17:46:40.548980423 +0000 UTC 
m=+4058.813695820" Apr 24 17:46:41.005842 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:41.005807 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-54d688d7d7-nlm52_185b1c1c-1ad5-4891-831d-b68d86e99611/registry/0.log" Apr 24 17:46:41.055517 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:41.055476 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-slcg8_6fd88de9-ce02-44e6-9d5c-0b5dbe13f0c4/node-ca/0.log" Apr 24 17:46:42.084661 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:42.084624 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6b27c_8ae78d18-eee9-4ff6-b5b1-81a6bd62493c/serve-healthcheck-canary/0.log" Apr 24 17:46:42.488361 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:42.488257 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8whz2_bc890b75-e729-4d8c-8e1d-05bc27ad8717/kube-rbac-proxy/0.log" Apr 24 17:46:42.509175 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:42.509131 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8whz2_bc890b75-e729-4d8c-8e1d-05bc27ad8717/exporter/0.log" Apr 24 17:46:42.534481 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:42.534447 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8whz2_bc890b75-e729-4d8c-8e1d-05bc27ad8717/extractor/0.log" Apr 24 17:46:44.778471 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:44.778440 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-ljksp_1a74b032-5164-4e07-a38f-776ab1d0eaf7/server/0.log" Apr 24 17:46:45.216852 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:45.216767 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve_odh-model-controller-696fc77849-8697t_a88daa8b-b0c0-4e87-8d60-7ffed570349b/manager/0.log" Apr 24 17:46:45.315237 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:45.315200 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-kw295_086299da-19b1-4a79-9d64-909f24944518/seaweedfs/0.log" Apr 24 17:46:45.339397 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:45.339358 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-94zdb_9a4d53d1-2ea2-4940-8b98-2e7c9641b22a/seaweedfs-tls-custom/0.log" Apr 24 17:46:46.547500 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:46.547461 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-9cngg" Apr 24 17:46:50.762634 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:50.762599 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x52vd_85c3e0af-8027-49c2-937d-99acbd5f7085/kube-multus-additional-cni-plugins/0.log" Apr 24 17:46:50.784202 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:50.784175 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x52vd_85c3e0af-8027-49c2-937d-99acbd5f7085/egress-router-binary-copy/0.log" Apr 24 17:46:50.807480 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:50.807447 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x52vd_85c3e0af-8027-49c2-937d-99acbd5f7085/cni-plugins/0.log" Apr 24 17:46:50.829128 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:50.829101 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x52vd_85c3e0af-8027-49c2-937d-99acbd5f7085/bond-cni-plugin/0.log" Apr 24 17:46:50.849588 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:50.849555 2573 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x52vd_85c3e0af-8027-49c2-937d-99acbd5f7085/routeoverride-cni/0.log" Apr 24 17:46:50.869688 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:50.869657 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x52vd_85c3e0af-8027-49c2-937d-99acbd5f7085/whereabouts-cni-bincopy/0.log" Apr 24 17:46:50.890430 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:50.890400 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x52vd_85c3e0af-8027-49c2-937d-99acbd5f7085/whereabouts-cni/0.log" Apr 24 17:46:50.935771 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:50.935739 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tkpx8_df6acdc8-67c6-4733-b49a-03a69f37ba5b/kube-multus/0.log" Apr 24 17:46:50.960361 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:50.960328 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-74mjh_3052a162-5d36-4309-9bd0-bca01410b715/network-metrics-daemon/0.log" Apr 24 17:46:50.976244 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:50.976216 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-74mjh_3052a162-5d36-4309-9bd0-bca01410b715/kube-rbac-proxy/0.log" Apr 24 17:46:52.401407 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:52.401365 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-controller/0.log" Apr 24 17:46:52.418605 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:52.418568 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/0.log" Apr 24 17:46:52.454923 ip-10-0-142-182 kubenswrapper[2573]: I0424 
17:46:52.454883 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovn-acl-logging/1.log" Apr 24 17:46:52.478937 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:52.478900 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/kube-rbac-proxy-node/0.log" Apr 24 17:46:52.501215 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:52.501172 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/kube-rbac-proxy-ovn-metrics/0.log" Apr 24 17:46:52.517877 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:52.517844 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/northd/0.log" Apr 24 17:46:52.539142 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:52.539110 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/nbdb/0.log" Apr 24 17:46:52.560959 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:52.560925 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/sbdb/0.log" Apr 24 17:46:52.758474 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:52.758442 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkfqt_86788d4c-c402-4057-988b-77279d8fd61c/ovnkube-controller/0.log" Apr 24 17:46:53.787358 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:53.787302 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-mgmm7_4f65e195-bb0b-4b16-893b-21667f13f3a5/network-check-target-container/0.log" Apr 24 17:46:54.611793 ip-10-0-142-182 
kubenswrapper[2573]: I0424 17:46:54.611759 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-75zzc_f5834b4f-9b4a-49b9-9cee-068a23d3d4a8/iptables-alerter/0.log" Apr 24 17:46:55.268449 ip-10-0-142-182 kubenswrapper[2573]: I0424 17:46:55.268407 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-hz7p4_0217b78c-e8f2-479d-a268-a3e3fad5f9b6/tuned/0.log"